Divers

  • Acer Releases XBO Series: 28-inch UHD/4K with G-Sync for $800 (AnandTech)

    Monitors are getting exciting. Not only are higher resolution panels becoming more of the norm, but the combination of different panel dimensions and feature sets means that buying the monitor you need for the next 10 years is getting more difficult. Today Acer adds some spice to the mix by announcing pre-orders for the XB280HK – a 28-inch TN monitor with 3840x2160 resolution that also supports NVIDIA’s G-Sync to reduce tearing and stuttering.

    Adaptive frame rate technologies are still in the early phases of adoption by the majority of users. AMD’s FreeSync is still a few quarters away from the market, and NVIDIA’s G-Sync requires an add-in card, which started off as an interesting, if expensive, monitor upgrade. Fast forward a couple of months and, as you might expect, the best place for G-Sync to go is into some of the more impressive monitor configurations. 4K is becoming a go-to resolution for anyone with deep enough wallets, although some might argue that 21:9 monitors might be better for gaming immersion at least.

    The XB280HK will support 3840x2160 at 60 Hz via DisplayPort 1.2, along with a 1 ms gray-to-gray response time and a fixed frequency up to 144 Hz. The stand will adjust up to 155 mm in height with 40° of tilt. There is also 120° of swivel and a full quarter turn of pivot, allowing for portrait-style use. The brightness of the panel is rated at 300 cd/m², and the 8-bit + Hi-FRC TN panel has a typical contrast ratio of 1000:1 and 72% NTSC coverage. A 100x100mm VESA mount is supported and the monitor includes a USB 3.0 hub, although there are no speakers.

    The XB280HK is currently available for pre-order in the UK at £500, but will have a US MSRP of $800. Also part of the Acer XBO range is the XB270H, a 27-inch 1920x1080 G-Sync panel with an MSRP of $600. The expected release date, according to the pre-orders, is the 3rd of October.

    Source: Acer

  • iFixit Tears Down the iPhone 6 Plus (MacBidouille)

    iFixit got its hands on an iPhone 6 Plus and took it apart.

    Good news: overall, disassembly is easy. On the component side, we find:

    • A 2915 mAh battery, double that of an iPhone 5S
    • A very bulky camera module, owing to its optical image stabilization
    • The A8 processor, paired with a 1 GB RAM chip
    • Brand-new antennas

    [Update] They have also torn down the iPhone 6.
    https://www.ifixit.com/Teardown/iPhone+6+Teardown/29213

    There too, no particular difficulty in reaching the components, apart from the Pentalobe screws, for which screwdrivers can now be found for a few euros.

  • Microsoft Details Direct3D 11.3 & 12 New Rendering Features (AnandTech)

    Back at GDC 2014 in March, Microsoft and its hardware partners first announced the next full iteration of the Direct3D API. Now on to version 12, this latest version of Direct3D would be focused on low level graphics programming, unlocking the greater performance and greater efficiency that game consoles have traditionally enjoyed by giving seasoned programmers more direct access to the underlying hardware. In particular, low level access would improve performance both by reducing the overhead high level APIs incur, and by allowing developers to better utilize multi-threading by making it far easier to have multiple threads submitting work.

    At the time Microsoft offered brief hints that there would be more to Direct3D 12 than just the low level API, but the low level API was certainly the focus for the day. Now as part of NVIDIA’s launch of the second generation Maxwell based GeForce GTX 980, Microsoft has opened up to the press and public a bit more on what their plans are for Direct3D. Direct3D 12 will indeed introduce new features, but there will be more in development than just Direct3D 12.

    Direct3D 11.3

    First and foremost then, Microsoft has announced that there will be a new version of Direct3D 11 coinciding with Direct3D 12. Dubbed Direct3D 11.3, this new version of Direct3D is a continuation of the development and evolution of the Direct3D 11 API and like the previous point updates will be adding API support for features found in upcoming hardware.

    At first glance the announcement of Direct3D 11.3 would appear to be at odds with Microsoft’s development work on Direct3D 12, but in reality there is a lot of sense in this announcement. Direct3D 12 is a low level API – powerful, but difficult to master and very dangerous in the hands of inexperienced programmers. The development model envisioned for Direct3D 12 is that a limited number of code gurus will be the ones writing the engines and renderers that target the new API, while everyone else will build on top of these engines. This works well for the many organizations that are licensing engines such as UE4, or for the smaller number of organizations that can justify having such experienced programmers on staff.

    However for these reasons a low level API is not suitable for everyone. High level APIs such as Direct3D 11 do exist for a good reason after all; their abstraction not only hides the quirks of the underlying hardware, but it makes development easier and more accessible as well. For these reasons there is a need to offer both high level and low level APIs. Direct3D 12 will be the low level API, and Direct3D 11 will continue to be developed to offer the same features through a high level API.

    Direct3D 12

    Today’s announcement of Direct3D 11.3 and the new feature set that Direct3D 11.3 and 12 will be sharing will have an impact on Direct3D 12 as well. We’ll get to the new features in a moment, but at a high level it should be noted that this means that Direct3D 12 is going to end up being a multi-generational (multi-feature level) API similar to Direct3D 11.

    In Direct3D 11 Microsoft introduced feature levels, which allowed programmers to target different generations of hardware using the same API, instead of having to write their code multiple times for each associated API generation. In practice this meant that programmers could target D3D 9, 10, and 11 hardware through the D3D 11 API, restricting their feature use accordingly to match the hardware capabilities. This functionality was exposed through feature levels (ex: FL9_3 for D3D9.0c capable hardware) which offered programmers a neat segmentation of feature sets and requirements.
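
    To make the mechanism concrete, here is a minimal sketch (standard Direct3D 11 usage, nothing specific to 11.3) of creating a device with a descending list of feature levels; the runtime hands back the highest level the hardware supports, and the renderer restricts its feature use accordingly.

      #include <windows.h>
      #include <d3d11.h>
      #include <cstdio>
      #pragma comment(lib, "d3d11.lib")

      int main() {
          // Ask for the highest feature level first; the runtime walks down the list.
          // (On a pre-11.1 runtime the 11_1 entry makes the call fail with E_INVALIDARG
          // and would normally be retried without it.)
          const D3D_FEATURE_LEVEL requested[] = {
              D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0,
              D3D_FEATURE_LEVEL_10_1, D3D_FEATURE_LEVEL_10_0,
              D3D_FEATURE_LEVEL_9_3      // FL9_3: D3D9.0c-class hardware
          };

          ID3D11Device*        device  = nullptr;
          ID3D11DeviceContext* context = nullptr;
          D3D_FEATURE_LEVEL    granted = D3D_FEATURE_LEVEL_9_1;

          HRESULT hr = D3D11CreateDevice(
              nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
              requested, ARRAYSIZE(requested), D3D11_SDK_VERSION,
              &device, &granted, &context);

          if (SUCCEEDED(hr)) {
              // 'granted' tells the renderer which hardware generation it actually got.
              std::printf("Running at feature level 0x%04x\n", granted);
              context->Release();
              device->Release();
          }
          return 0;
      }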

    Direct3D 12 in turn will also be making use of feature levels, allowing developers to exploit the benefits of the low level nature of the API while being able to target multiple generations of hardware. It’s through this mechanism that Direct3D 12 will be usable on GPUs as old as NVIDIA’s Fermi family or as new as their Maxwell family, all the while still being able to utilize the features added in newer generations.

    Ultimately for users this means they will need to be mindful of feature levels, just as they are today with Direct3D 11. Hardware that is Direct3D 12 compatible does not mean it supports all of the latest feature sets, and keeping track of feature set compatibility for each generation of hardware will still be important going forward.

    11.3 & 12: New Features

    Getting to the heart of today’s announcement from Microsoft, we have the newly announced features that will be coming to Direct3D 11.3 and 12. It should be noted that at this point in time this is not an exhaustive list of all of the new features that we will see, and Microsoft is still working to define a new feature level to go with them (in the interim they will be accessed through cap bits), but nonetheless this is our first detailed look at what are expected to be the major new features of 11.3/12.
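
    Until that feature level is defined, checking for the new features at runtime will presumably look like the usual cap-bit query through CheckFeatureSupport. The options structure, enum value and member names in the sketch below follow the pattern of the existing D3D11_FEATURE_D3D11_OPTIONS queries and are assumptions on our part rather than confirmed API, but they show the general shape such a check would take.

      #include <d3d11_3.h>   // assumed header revision exposing the new cap bits

      struct NewFeatureCaps {
          bool rovs;
          bool typedUavLoad;
          bool conservativeRaster;
          bool volumeTiledResources;
      };

      // Hypothetical query: names mirror the existing D3D11_OPTIONS pattern.
      NewFeatureCaps QueryNewCaps(ID3D11Device* device) {
          NewFeatureCaps caps = {};
          D3D11_FEATURE_DATA_D3D11_OPTIONS2 opts = {};
          if (SUCCEEDED(device->CheckFeatureSupport(
                  D3D11_FEATURE_D3D11_OPTIONS2, &opts, sizeof(opts)))) {
              caps.rovs                 = opts.ROVsSupported != 0;
              caps.typedUavLoad         = opts.TypedUAVLoadAdditionalFormats != 0;
              caps.conservativeRaster   = opts.ConservativeRasterizationTier !=
                                          D3D11_CONSERVATIVE_RASTERIZATION_NOT_SUPPORTED;
              caps.volumeTiledResources = opts.TiledResourcesTier >= D3D11_TILED_RESOURCES_TIER_3;
          }
          return caps;
      }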

    Rasterizer Ordered Views

    First and foremost of the new features is Rasterizer Ordered Views (ROVs). As hinted at by the name, ROVs are focused on giving the developer control over the order that elements are rasterized in a scene, so that elements are drawn in the correct order. This feature specifically applies to Unordered Access Views (UAVs) being generated by pixel shaders, which by their very definition are initially unordered. ROVs offer an alternative to the UAVs’ unordered nature, which would otherwise result in elements being rasterized simply in the order they were finished. For most rendering tasks unordered rasterization is fine (deeper elements would be occluded anyhow), but for a certain category of tasks having the ability to efficiently control the access order to a UAV is important to correctly render a scene quickly.

    The textbook use case for ROVs is Order Independent Transparency, which allows for elements to be rendered in any order and still blended together correctly in the final result. OIT is not new – Direct3D 11 gave the API enough flexibility to accomplish this task – however these earlier OIT implementations would be very slow due to sorting, restricting their usefulness outside of CAD/CAM. The ROV implementation however could accomplish the same task much more quickly by getting the order correct from the start, as opposed to having to sort results after the fact.
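
    To see why the ordering matters, here is a small CPU-side sketch of what a transparency resolve has to do for one pixel: composite its fragments in depth order. The brute-force Direct3D 11-era approach sorts after the fact, while a ROV effectively lets the pixel shader enforce the same ordering as fragments arrive. The fragment layout and blend below are purely illustrative, not any particular API.

      #include <algorithm>
      #include <vector>

      struct Fragment {       // one transparent fragment that landed on this pixel
          float depth;        // distance from the camera
          float rgb[3];       // premultiplied color
          float alpha;        // opacity
      };

      // Composite a pixel's fragments back to front with the "over" operator. Blending
      // in the wrong order gives the wrong result, which is exactly the problem OIT
      // (and its ROV-based variant) has to solve.
      void ResolvePixel(std::vector<Fragment> frags, float out[3]) {
          std::sort(frags.begin(), frags.end(),
                    [](const Fragment& a, const Fragment& b) { return a.depth > b.depth; });
          out[0] = out[1] = out[2] = 0.0f;
          for (const Fragment& f : frags)
              for (int c = 0; c < 3; ++c)
                  out[c] = f.rgb[c] + (1.0f - f.alpha) * out[c];   // premultiplied "over"
      }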

    Along these lines, since OIT is just a specialized case of a pixel blending operation, ROVs will also be usable for other tasks that require controlled pixel blending, including certain cases of anti-aliasing.

    Typed UAV Load

     

    The second feature coming to Direct3D is Typed UAV Load. Unordered Access Views (UAVs) are a special type of buffer that allows multiple GPU threads to access the same buffer simultaneously without generating memory conflicts. Because of this disorganized nature of UAVs, certain restrictions are in place that Typed UAV Load will address. As implied by the name, Typed UAV Load deals with cases where UAVs are data typed, and how to better handle their use.

    Volume Tiled Resources

     

    The third feature coming to Direct3D is Volume Tiled Resources. VTR builds off of the work Microsoft and partners have already done for tiled resources (AKA sparse allocation, AKA hardware megatexture) by extending it into the 3rd dimension.

    VTRs are primarily meant to be used with volumetric pixels (voxels), with the idea being that with sparse allocation, volume tiles that do not contain any useful information can avoid being allocated, avoiding tying up memory in tiles that will never be used or accessed. This kind of sparse allocation is necessary to make certain kinds of voxel techniques viable.
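
    A rough way to picture the benefit is a voxel volume where only the tiles that actually contain data get backing memory. The CPU-side sketch below models this with a hash map keyed by tile coordinate (the tile size and types are arbitrary choices for illustration); volume tiled resources do the equivalent at the resource level, with only the tiles an application touches being mapped to memory.

      #include <cstdint>
      #include <unordered_map>

      // One 32x32x16 tile of voxel data, allocated only when something is written to it.
      struct VolumeTile { uint8_t voxels[32 * 32 * 16]; };

      struct TileKey {
          int x, y, z;
          bool operator==(const TileKey& o) const { return x == o.x && y == o.y && z == o.z; }
      };
      struct TileKeyHash {
          size_t operator()(const TileKey& k) const {
              return (size_t(k.x) * 73856093u) ^ (size_t(k.y) * 19349663u) ^ (size_t(k.z) * 83492791u);
          }
      };

      // Sparse voxel volume: empty space costs no memory at all (coordinates assumed >= 0).
      class SparseVolume {
      public:
          void Write(int x, int y, int z, uint8_t v) {
              TileKey key{ x / 32, y / 32, z / 16 };
              tiles_[key].voxels[Index(x, y, z)] = v;   // tile is created on first touch
          }
          size_t AllocatedTiles() const { return tiles_.size(); }
      private:
          static int Index(int x, int y, int z) {
              return (x % 32) + (y % 32) * 32 + (z % 16) * 32 * 32;
          }
          std::unordered_map<TileKey, VolumeTile, TileKeyHash> tiles_;
      };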

    Conservative Rasterization

     

    Last but certainly not least among Direct3D’s new features will be conservative rasterization. Conservative rasterization is essentially a more accurate but performance intensive solution to figuring out whether a polygon covers part of a pixel. Instead of doing a quick and simple test to see if the center of the pixel is bounded by the lines of the polygon, conservative rasterization checks whether the pixel covers the polygon by testing it against the corners of the pixel. This means that conservative rasterization will catch cases where a polygon was too small to cover the center of a pixel, which results in a more accurate outcome, be it better identifying the pixels a polygon resides in, or finding polygons too small to cover the center of any pixel at all. This is where the “conservative” aspect of the name comes from: the rasterizer is conservative in that it includes every pixel touched by a triangle, as opposed to just the pixels where the triangle covers the center point.
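
    The difference between the two tests can be written down directly with edge functions. In the sketch below (plain geometry, no API), the standard test evaluates each edge at the pixel center, while the conservative test pushes each edge outward by half a pixel, which is equivalent to evaluating it at the pixel corner lying farthest along that edge's interior direction, so any pixel the triangle touches passes.

      #include <cmath>

      struct Vec2 { float x, y; };

      // Edge function: positive when p is on the interior side of edge a->b
      // (counter-clockwise triangle winding assumed).
      static float Edge(Vec2 a, Vec2 b, Vec2 p) {
          return (b.x - a.x) * (p.y - a.y) - (b.y - a.y) * (p.x - a.x);
      }

      // Standard rasterization test: is the pixel *center* inside the triangle?
      bool CoversCenter(Vec2 v0, Vec2 v1, Vec2 v2, Vec2 center) {
          return Edge(v0, v1, center) >= 0 &&
                 Edge(v1, v2, center) >= 0 &&
                 Edge(v2, v0, center) >= 0;
      }

      // Conservative test for a 1x1 pixel centered at 'center': offsetting each edge by
      // half a pixel along its normal means a sliver triangle that misses every pixel
      // center is still reported for the pixels it overlaps.
      bool CoversConservative(Vec2 v0, Vec2 v1, Vec2 v2, Vec2 center) {
          auto pass = [&](Vec2 a, Vec2 b) {
              float slack = 0.5f * (std::fabs(b.x - a.x) + std::fabs(b.y - a.y));
              return Edge(a, b, center) + slack >= 0;
          };
          return pass(v0, v1) && pass(v1, v2) && pass(v2, v0);
      }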

    Conservative rasterization is being added to Direct3D in order to allow new algorithms to be used which would fail under the imprecise nature of point sampling. Like VTR, voxels play a big part here as conservative rasterization can be used to build a voxel. However it also has use cases in more accurate tiling and even collision detection.

    Final Words

    Wrapping things up, today’s announcement of Direct3D 11.3 and its new features offers a solid roadmap for both the evolution of Direct3D and the hardware that will support it. By confirming that they are continuing to work on Direct3D 11 Microsoft has answered one of the lingering questions surrounding Direct3D 12 – what happens to Direct3D 11 – and at the same time this highlights the hardware features that the next generation of hardware will need to support in order to be compliant with the latest D3D feature level. And with Direct3D 12 set to be released sometime next year, these new features won’t be too far off either.

  • NVIDIA GameWorks: More Effects with Less Effort (AnandTech)

    While NVIDIA's hardware is the big star of the day, the software that we run on the hardware is becoming increasingly important. It's one thing to create the world's fastest GPU, but what good is the GPU if you don't have anything that can leverage all that performance? As part of their ongoing drive to improve the state of computer graphics, NVIDIA has a dedicated team of over 300 engineers whose primary focus is the creation of tools and technologies to make the lives of game developers better.

    GameWorks consists of several items. There's the core SDK (Software Development Kit), along with IDE (Integrated Development Environment) tools for debugging, profiling, and other items a developer might need. Beyond the core SDK, NVIDIA has a Visual FX SDK, a PhysX SDK, and an Optix SDK. The Visual FX SDK offers solutions for complex, realistic effects (e.g. smoke and fire, faces, waves/water, hair, shadows, and turbulence). PhysX is for physics calculations (either CPU or GPU based, depending on the system). Optix is a ray tracing engine and framework, often used to pre-calculate ("bake") lighting in game levels. NVIDIA also provides sample code for graphics and compute, organized by effect and with tutorials.

    Many of the technologies that are part of GameWorks have been around for a few years, but NVIDIA is constantly working on improving their GameWorks library and they had several new technologies on display at their GM204 briefing. One of the big ones has already been covered in our GM204 review, VXGI (Voxel Global Illumination), so I won't rehash that here; basically, it allows for more accurate and realistic indirect lighting. Another new technology that NVIDIA showed is called Turf Effects, which properly simulates individual blades of grass (or at least clumps of grass). Finally, PhysX FleX also has a couple new additions, Adhesion and Gases; FleX uses PhysX to provide GPU simulations of particles, fluids, cloth, etc.

    Still images don't do justice to most of these effects, and NVIDIA will most likely have videos available in the future to show what they look like. PhysX FleX for example has a page with a currently unavailable video, so hopefully they'll update that with a live video in the coming weeks. You can find additional content related to GameWorks on the official website.

    The holiday 2014 season will see the usual avalanche of new games, and many of the AAA titles will sport at least one or two technologies that come from GameWorks. Here's a short list of some of the games, and then we'll have some screen shots to help illustrate what some of the specific technologies do.

    Upcoming Titles with GameWorks Technologies
    Assassin’s Creed: Unity        HBAO+, TXAA, PCSS, Tessellation
    Batman: Arkham Knight          Turbulence, Environmental PhysX, Volumetric Lights, FaceWorks, Rain Effects
    Borderlands: The Pre-Sequel    PhysX Particles
    Far Cry 4                      HBAO+, PCSS, TXAA, God Rays, Fur, Enhanced 4K Support
    Project CARS                   DX11, Turbulence, PhysX Particles, Enhanced 4K Support
    Strife                         PhysX Particles, HairWorks
    The Crew                       HBAO+, TXAA
    The Witcher 3: Wild Hunt       HairWorks, HBAO+, PhysX, Destruction, Clothing
    Warface                        PhysX Particles, Turbulence, Enhanced 4K Support
    War Thunder                    WaveWorks, Destruction

    In terms of upcoming games, the two most prominent titles are probably Assassin's Creed Unity and Far Cry 4, and we've created a gallery for each. Both games use multiple GameWorks elements, and NVIDIA was able to provide before/after comparisons for FC4 and AC Unity. Batman: Arkham Knight and The Witcher 3: Wild Hunt also incorporate many effects from GameWorks, but we didn't get any with/without comparisons.


    Starting with HBAO+ (Horizon Based Ambient Occlusion), this is a newer way of performing Ambient Occlusion calculations (SSAO, Screen Space AO, being the previous solution that many games have used). Games vary in how they perform AO, but if we look at the AC Unity comparison between HBAO+ and the default AO (presumably SSAO), HBAO+ clearly offers better shadows. HBAO+ is also supposed to be faster and more efficient than other AO techniques.

    TXAA (Temporal Anti-Aliasing) basically combines a variety of filters and post processing techniques to help eliminate jaggies, something which we can all hopefully appreciate. There's one problem I've noticed with TXAA however, which you can see in the above screenshot: it tends to make the entire image look rather blurry in my opinion. It's almost as though someone took Photoshop's "blur" filter and applied it to the image.

    PCSS (Percentage Closer Soft Shadows) was introduced a couple years back, which means we should start seeing it in more shipping games. You can see the video from 2012, and AC Unity and Far Cry 4 are among the first games that will offer PCSS.

    Tessellation has been around for a few years now in games, and the concepts behind tessellation go back much further. The net result is that tessellation allows developers to extrude geometry from an otherwise flat surface, creating a much more realistic appearance to games when used appropriately. The cobble stone streets and roof shingles in AC Unity are great examples of the difference tessellation makes.

    God rays are a lighting feature that we've seen before, but now NVIDIA has implemented a new way of calculating the shafts of light. They now use tessellation to extrude the shadow mapping and actually create transparent beams of light that they can render.

    HairWorks is a way to simulate large strands of hair instead of using standard textures – Far Cry 4 and The Witcher 3 will both use HairWorks, though I have to admit that the hair in motion still doesn't look quite right to me. I think we still need an order of magnitude more "hair", and similar to the TressFX in Tomb Raider this is a step forward but we're not there yet.

    There are some additional effects being used in other games – Turbulence, Destruction, FaceWorks, WaveWorks, PhysX, etc. – but the above items give us a good idea of what GameWorks can provide. What's truly interesting about GameWorks is that these libraries are free for any developers that want to use them. The reason for creating GameWorks and basically giving it away is quite simple: NVIDIA needs to entice developers (and perhaps more importantly, publishers) into including these new technologies, as it helps to drive sales of their GPUs among other things. Consider the following (probably not so hypothetical) exchange between a developer and their publisher, paraphrased from NVIDIA's presentation on GameWorks.

    A publisher wants to know when game XYZ is ready to ship, and the developer says it's basically done, but they're excited about some really cool features that will just blow people away, and it will take a few more months to get those finished up. "How many people actually have the hardware required to run these new features?" asks the publisher. When the developers guess that only 5% or so of the potential customers have the hardware necessary, you can guess what happens: the new features get cut, and game XYZ ships sooner rather than later.

    We've seen this sort of thing happen many times – as an example, Crysis 2 shipped without DX11 support (since the consoles couldn't support that level of detail), adding it in a patch a couple months later. Other games never even see such a patch and we're left with somewhat less impressive visuals. While it's true that great graphics do not an awesome game make, they can certainly enhance the experience when used properly.

    It's worth pointing out that GameWorks is not necessarily exclusive to NVIDIA hardware. While PhysX as an example was originally ported to CUDA, developers have used PhysX on CPUs for many games, and as you can see in the above slide there are many PhysX items that are supported on other platforms. Several of the libraries (Turbulence, WaveWorks, HairWorks, ShadowWorks, FlameWorks, and FaceWorks) are also listed as "planned" for being ported to the latest generation of gaming consoles. Android is also a growing part of NVIDIA's plans, with the Tegra K1 effectively bringing the same feature set over to the mobile world that we've had on PCs and notebooks for the past couple of years.

    NVIDIA for their part wants to drive the state of the art forward, so that the customers (gamers) demand these high-end technologies and the publishers feel compelled to support them. After all, no publisher would expect great sales from a modern first-person shooter that looks like it was created 10 years ago [insert obligatory Daikatana reference here], but it's a bit of a chicken vs. egg problem. NVIDIA is trying to push things along and maybe hatch the egg a bit earlier, and there have definitely been improvements thanks to their efforts. We applaud their efforts, and more importantly we look forward to seeing better looking games as a result.

  • The NVIDIA GeForce GTX 980 Review: Maxwell Mark 2 (AnandTech)

    At the start of this year we saw the first half of the Maxwell architecture in the form of the GeForce GTX 750 and GTX 750 Ti. Based on the first generation Maxwell based GM107 GPU, NVIDIA did something we still can hardly believe and managed to pull off a trifecta of improvements over Kepler. GTX 750 Ti was significantly faster than its predecessor, it was denser than its predecessor (though larger overall), and perhaps most importantly consumed less power than its predecessor. In GM107 NVIDIA was able to significantly improve their performance and reduce their power consumption at the same time, all on the same 28nm manufacturing node we’ve come to know since 2012. For NVIDIA this was a major accomplishment, and to this day competitor AMD doesn’t have a real answer to GM107’s energy efficiency.

    However GM107 was only the start of the story. In deviating from their typical strategy of launching a high-end GPU first – either a 100/110 or 104 GPU – NVIDIA told us up front that while they were launching at the low end first because that made the most sense for them, they would be following up on GM107 later this year with what at the time was being called “second generation Maxwell”. Now, 7 months later and true to their word, NVIDIA is back in the spotlight with the first of the second generation Maxwell GPUs, GM204.

  • iPhone 6 launch day (MacBidouille)

    Even though the party has been spoiled for the APRs, today is the big day for the iPhone 6.
    Between those who will receive one via UPS and those who will queue at the carriers or in an Apple Store, there will certainly already be a few hundred thousand of them in France by tonight, and millions worldwide.
    Our thanks to those lucky enough to have received theirs for sharing their impressions with the others: those who prefer to wait before choosing between an iPhone and an iPhone Plus, or the more cautious who want real-world feedback first.

    You can also send us photos of the queues outside the Apple Stores or the carriers, and keep us posted if any points of sale run out of stock.

    A good, long day to everyone who will want to get close to, or play with, an iPhone 6.

    [Update] Thomas sent us this comment and this photo:

    At the Apple Store in Lyon Part-Dieu, a group of about thirty people has been forming since 3 pm yesterday, including a dozen Italians. Good-natured atmosphere.


    In the Paris region, the iPhones due to be delivered this morning have started leaving the UPS warehouses in Saint-Ouen.

  • Hands On With ODG's R-7: Augmented Reality Glasses (AnandTech)

    While it's still unclear to me what the future of wearables will be, I must admit that all things considered I feel that glasses are a better idea than watches as a form factor. If the goal is glanceable information, a heads-up display is probably as good as it gets. This brings us to the ODG R-7, which is part of Qualcomm's Vuforia for Digital Eyewear (VDE) platform. This VDE platform brings new capabilities for augmented reality. What this really means is that developers no longer need to worry about coming up with their own system of aligning content from a VR headset to the real world, as this platform makes it a relatively simple process. Judging by the ODG R-7, there's no need for a 3D camera to pull this off.

    So let's talk about the ODG R-7, one of the most fascinating wearables I've ever seen. While its primary purpose is for government and industrial use, it isn't a far leap to see the possibilities for consumers. For reference, the ODG R-7 that I saw at this show is an early rev, and effectively still a prototype. However, the initial specs have been established. This wearable has a Qualcomm Snapdragon 805 SoC running at 2.7 GHz, with anywhere between one and four gigabytes of RAM and 16 to 128 gigabytes of storage. There are two see-through 720p LCoS displays that run at a 100 Hz refresh rate. There's one 5MP camera on the front to enable the augmented vision aspects. There's also one battery on each side of the frame, for a 1400 mAh battery capacity, likely at a 3.8 V nominal voltage.

    While the specs are one thing, the actual device itself is another. In person, this is clearly still a prototype as on the face it feels noticeably front heavy, which is where all of the electronics are contained. It's quite obvious that this is running up against thermal limits, as there is a noticeable heat sink running along the top of the glasses. This area gets noticeably hot during operation, and easily feels to be around 50-60C although the final product is likely to be much cooler in operation.

    However, these specs aren't really what matter so much as the use cases demonstrated. While it's effectively impossible to really show what it looks like, one demo shown was a terrain map. When this was detected by the glasses, it automatically turned the map into a 3D model that could be viewed from any angle. In addition, a live UAV feed was just above the map, with the position of the UAV indicated by a 3D model orbiting around the map.

    It's definitely not a long shot to guess the next logical steps for such a system. Overlaying directions for turn by turn navigation is one obvious use case, as is simple notification management, similar to Android Wear watches. If anything, the potential for glasses is greater than watches, as it's much harder to notice glasses in day to day use since they rely on gravity instead of tension like a watch band. However, it could be that I'm biased, as I've worn glasses all my life.

  • NERO now burns discs from smartphones (MacBidouille)

    Times change, and those who do not adapt are doomed to disappear. That is certainly what the developers of NERO, the famous PC burning software, told themselves.
    Their latest version, 2015, introduces a surprising new feature: very extensive support for Android and iOS smartphones (too bad for the others).
    The first application is almost surprising, since it is... the software's electronic manual. Not very useful, then, unless your PC has a tiny screen.
    The second is already a bit more interesting. It transfers the data you want to back up or store to the PC, which then burns it to a DVD. Suffice to say you will need a pile of DVDs, or a few Blu-rays, to archive a 128 GB iPhone.
    The last one turns the PC into a media server from which videos can be streamed to smartphones. On the Mac this kind of thing already exists, for example from Elgato.

    So the greatest value of this version is mostly in showing that CD and DVD burning is on its way out, at least enough that burning software has to be given other functions, dedicated to the public's darlings: smartphones and tablets.

  • iPhone 6: a big new apple of discord between Apple and the APRs (MacBidouille)

    As you know if you follow our news, relations between Apple and its Apple Premium Resellers are rather tense, and legal proceedings are still underway between the shareholders of the late eBizcuss and Apple.
    The launch of the iPhone 6 has triggered a new fit of anger among the APRs. While they have only recently been officially authorized to sell Apple's handsets, the company has decided to supply them with almost none. Each store will receive between 1 and 5 units, and only 16 GB models at that, the least sought-after ones.
    Worse, Apple has decided that the APRs will not be able to sell 128 GB models to their customers. They will only get access to the 16 and 64 GB versions... someday.

    In the best case they will start receiving stock at the end of next week at the earliest, and more likely another week after that... once the party is over.

    The owners of these shops are furious, especially knowing that the Apple Stores will have pallets of them to sell. If you still had any doubt about Apple's respect for its resellers, we think it has now been dispelled. And if you still thought Apple could run two distribution channels, one direct and one indirect, without favoring the former, which earns it a few extra points of margin, you now have your answer as well.

  • Samsung now produces 3 GB mobile RAM chips at 20nm (MacBidouille)

    Samsung has announced that it has started production of LPDDR3 chips, very low power RAM, manufactured on a 20nm process. The first chips to benefit will be made of six 512 MB layers, stacked to reach 3 GB.
    Compared to the previous generation, these new chips save a little more energy and therefore improve battery life.
    Samsung will certainly use them in its smartphones and tablets as soon as possible.

    The timing of the announcement is probably no accident. It comes at the same time as the first deliveries of the iPhone 6 and 6 Plus, two models still fitted with only 1 GB of RAM, something that some people, above all the competition, strongly criticize.

  • Microsoft Office: leaked images of the next version (Génération NT: logiciels)
    The next version of Microsoft Office is revealed a little through the publication of a few images from a technical preview.
  • Safari 7.1 available (MacBidouille)

    Thanks to the readers who let us know that Safari 7.1 is available for download.
    Naturally, this new release will certainly be faster than the previous ones :)

  • Windows apps: end of annual fees for developers (Génération NT: logiciels)
    Developers of Windows and Windows Phone applications no longer have to pay an annual fee.
  • Apple: 10.9.5 update for OS X Mavericks (Génération NT: logiciels)
    Apple is delivering an update to its operating system for Mac computers. It is recommended for all OS X Mavericks users.
  • Intel’s Haswell-EP Xeons with DDR3 and DDR4 on the Horizon? (AnandTech)

    Johan’s awesome overview of the Haswell-EP ecosystem showed that the server processor line from Intel is firmly on track for DDR4 memory, along with the associated benefits of lower power consumption, higher absolute frequencies and higher capacity modules. At that point, we all assumed that all Haswell-EP Xeons using the LGA2011-3 socket were DDR4 only, requiring each new CPU to be used with the newer generation modules. However thanks to ASRock’s server team, ASRock Rack, it would seem that there will be some Xeons for sale from Intel with both DDR3 and DDR4 support.

    Caught by Patrick at ServeTheHome, ASRock Rack had released their motherboard line without much of a fuss. There is nothing strange about that in itself; however the following four models were the subject of interest:

    A quick email to our contacts at ASRock provided the solution: Intel is going to launch several SKUs with a dual DDR3/DDR4 controller. These processors are available in eight, ten and twelve core flavors, ranging from 85W to 120W:

    QVL CPUs for ASRock Rack EPC612D8T
                              E5-2629 v3     E5-2649 v3     E5-2669 v3
    Cores / Threads           8 / 16         10 / 20        12 / 24
    Base Frequency (GHz)      2.4            2.3            2.3
    L3 Cache (MB)             20             25             30
    TDP (W)                   85             105            120

    At the current time there is no release date or pricing for these DDR3 Haswell-EP processors, however it would seem that ASRock Rack is shipping these motherboards to distributors already, meaning that Intel cannot be far behind. It does offer a server team the ability to reuse the expensive DDR3 memory they already have, especially given the DDR4 premium, although the processor counts are limited.

    CPU-World suggested that these processors have dual memory controllers, and we received confirmation that this is true. This could suggest that all Xeons have dual memory controllers but with DDR3 disabled. Note that these motherboards would reject a DDR4-only CPU as a result of their layout. It does potentially pave the way for combination DDR3/DDR4 based LGA2011-3 motherboards in the future. We have also been told that the minimum order quantity for these CPUs might be higher than average, and thus server admins will have to contact their Intel distribution network for exact numbers. This might put a halt on smaller configurations keeping their DDR3.

    Source: ServeTheHome, ASRock Rack

  • OS X 10.9.5 (MacBidouille)

    As expected, Apple has released the OS X 10.9.5 update.
    It mainly improves VPN and SMB connections.

  • "Modern" SSDs: beware of the transfer rates advertised by manufacturers! (MacBidouille)

    While TLC memory (which stores three bits per cell) performs worse than the classic MLC (two bits per cell) found in most consumer SSDs, the transfer rates advertised by manufacturers for their TLC-based models are often excellent, on a par with those of MLC SSDs. You should nevertheless be even more wary of these figures than usual...

    Indeed, to reach these good speeds Samsung and SanDisk, the two main manufacturers using TLC, have built pseudo-"cache" mechanisms into their SSDs, which consist of using part of the memory as if it were SLC (one bit per cell). Write performance is then far higher, but only as long as this "cache" is not full. The SSD then takes advantage of idle periods to move that data back into TLC mode.

    We already knew that on the Samsung 840 EVO, sequential write performance was divided by two to three, depending on capacity, once the "cache" was full. Now it is SanDisk's turn to be called out by AnandTech on this point: the 240 GB Ultra II, rated at 400 MB/s sequential, drops to 240 MB/s once the "cache" is full. This drop does occur later than on the 250 GB Samsung, however, which carries only 3 GB of "cache" versus 10 GB for the SanDisk.
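
    As a rough illustration of what this does to a long transfer, here is a small model of the average speed of one sustained sequential write: the first gigabytes go at the "cache" speed, the rest at the post-cache speed. The numbers plugged in are the ones quoted above for the 240 GB Ultra II (10 GB at 400 MB/s, then 240 MB/s), and the model deliberately ignores the cache being flushed during idle periods.

      #include <algorithm>
      #include <cstdio>

      // Average MB/s for writing 'totalGB' in one go, given an SLC-style cache of
      // 'cacheGB' written at 'fastMBs' and the remainder written at 'slowMBs'.
      double AverageWriteSpeed(double totalGB, double cacheGB, double fastMBs, double slowMBs) {
          const double MB_PER_GB = 1000.0;                 // marketing units
          double fastPart = std::min(totalGB, cacheGB) * MB_PER_GB;
          double slowPart = std::max(0.0, totalGB - cacheGB) * MB_PER_GB;
          double seconds  = fastPart / fastMBs + slowPart / slowMBs;
          return (fastPart + slowPart) / seconds;
      }

      int main() {
          // SanDisk Ultra II 240 GB figures quoted above: 10 GB cache, 400 then 240 MB/s.
          const double sizes[] = { 5.0, 20.0, 50.0 };
          for (double gb : sizes)
              std::printf("%4.0f GB written: %.0f MB/s average\n",
                          gb, AverageWriteSpeed(gb, 10.0, 400.0, 240.0));
          return 0;
      }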

    To a lesser extent, Crucial's new M600 drives are also affected by this drop in performance, but only in the small capacities (128 and 256 GB) and the gumstick (module) versions. They do use MLC memory, the same as in the MX100, whose write speeds top out at 150 MB/s for the 128 GB model and 330 MB/s for the 256 GB model, yet Crucial advertises 510 MB/s across the entire M600 range. To get there, the SSD also operates in SLC mode, but this time over a volume limited only by the total capacity: as long as the SSD is less than 50% full, only one bit is written per cell; beyond that, the data is reorganized to store 2 bits per cell.

    So while the TLC-based Samsung and SanDisk SSDs collapse during long write operations, Crucial's will only collapse once the drive is more than 50% full. But the outcome is the same: in both cases you end up with transfer rates well below the manufacturers' promises.

    Sequential write speed should therefore no longer be used to compare SSDs, as the figure loses much of its meaning when manufacturers put in place mechanisms that can make it vary by a factor of three from one moment to the next...

    Note all the same that these shortfalls should be kept in perspective, since the main benefit of an SSD remains its low access times, while sequential write speeds have relatively little influence on perceived performance in everyday use, except when handling very large files.

  • Another Apple presentation in October (MacBidouille)

    According to AppleInsider, Apple will put on another product presentation during the second half of October. Everything suggests it will concern the future iPad lineup, but it is quite possible the company will also take the opportunity to present new Macs, why not a Retina iMac or... its first ARM-based Mac.

    Among the products expected, or at least announced by the rumor mill on a more or less short horizon, are a 12" iPad, Retina MacBook Airs, a new version of the Apple TV... So there is plenty to work with, and to justify a second Special Event Apple will inevitably want to strike hard.

  • Foxconn assembles 540,000 iPhone 6 units every day (MacBidouille)

    Some figures are so large they are hard to believe. The Wall Street Journal reports that Foxconn's factories are currently turning out 540,000 iPhones per day: 400,000 iPhone 6 and 140,000 iPhone 6 Plus.
    To reach this volume the company has set up 100 production lines, and 200,000 people (the population of the city of Rennes) are working to assemble these devices.
    Foxconn is reportedly having unprecedented difficulty assembling these larger devices, and is still looking for ways to ramp up production of the 5.5" model, which is not easy because the screens are not arriving in sufficient quantities.

    To give you an idea of what this volume represents, these devices weigh more than 75 tonnes in total, and laid end to end lengthwise they would stretch 76 km!
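
    The arithmetic checks out if you plug in Apple's published figures for the two models (129 g and 138.1 mm for the iPhone 6, 172 g and 158.1 mm for the 6 Plus), as this quick sketch shows:

      #include <cstdio>

      int main() {
          // Daily output reported by the WSJ, with Apple's published weight and height per model.
          const double units6 = 400000, units6p = 140000;
          const double grams6 = 129.0,  grams6p = 172.0;    // iPhone 6 / 6 Plus weight
          const double mm6    = 138.1,  mm6p    = 158.1;    // iPhone 6 / 6 Plus height

          double tonnes = (units6 * grams6 + units6p * grams6p) / 1e6;   // g -> t
          double km     = (units6 * mm6    + units6p * mm6p)    / 1e6;   // mm -> km

          // Prints roughly 75.7 tonnes and 77.4 km, in line with the figures above.
          std::printf("One day of production: %.1f tonnes, %.1f km end to end\n", tonnes, km);
          return 0;
      }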

  • VDSL2 over indirect distribution is coming (MacBidouille)

    For almost a year now, ADSL operators have been offering their customers the VDSL2 protocol, which lets those close to the exchange reach the theoretical maximum speed of 100 Mbit/s.
    While some can already benefit from it, anyone with a street cabinet (sub-loop) between the exchange and their home could not get access. That regulatory restriction will be lifted on October 27.
    Far more people will therefore be able to benefit. However, according to ARCEP, only 14.5% of telephone lines are short enough for this technology to be worthwhile compared to ADSL. Fiber optics thus remains, now as ever, the best way forward to replace the copper pair.

  • Lexar announces the world's fastest SD cards (MacBidouille)

    The SD card speed war is far from over. Since last February, SanDisk has had the fastest product in the world in its catalog, capable of reaching 280 MB/s in read.

    Lexar has just beaten it by announcing its first cards capable of reaching 300 MB/s. Available in 32 and 64 GB capacities, they will cost $105.99 and $184.99 respectively.
    You will of course need a device capable of handling such speeds, and a USB 3.0 or Thunderbolt reader to take advantage of them; as it happens, Lexar sells one.

  • Adobe's profits drop sharply (MacBidouille)

    Some time ago, Adobe decided to completely overhaul its business model. Rather than selling its software and suites at a high price with each release, the company decided to switch to a monthly or annual subscription system. That way it can smooth out its results, without spikes at release time followed by long quiet periods.
    Clearly the company's transition is not over. While it claims to have gained 500,000 customers since moving to the new system, its revenue is barely growing, coming in at 1.01 billion dollars last quarter (versus 995 million previously), and its profit is even down sharply, at 44.7 million versus 83 million over the same period last year.

    Granted, the transition is not finished, and in time owners of older versions will have to move to the cloud, but it is also possible that some people have preferred to subscribe long term only to the software they need every day, taking out an occasional subscription to the other products as needs arise.

  • The iOS 8 update is available (MacBidouille)

    While some are champing at the bit waiting to receive their new iPhone 6, especially those whose current model is too old to run Apple's latest mobile OS (iPhone 4 and earlier), others can already install it on their favorite devices. Head to Settings > General > Software Update. Be warned, it is still a 1.9 GB download, so there may well be congestion at some ISPs tonight!

    As a refresher, if you have not followed the news lately about iOS 8's new features, head over to the dedicated page on Apple's website.

    [Update]

    The size of the update seems to vary depending on the iOS version you are upgrading from, as here between a version 7 and... 5! ;)

  • The iOS 8 Review (AnandTech)

    Another year has passed and like clockwork Apple has released a new iPhone and a new version of iOS to accompany it. Our reviews of both new iPhones will be coming soon, with a look at new iOS features specific to those devices like Apple Pay, but with iOS 8 rolling out today to millions of existing iOS users across the iPad, iPhone, and iPod Touch, it's worth taking a look at what Apple is bringing to the users that are already in the iOS ecosystem. The eighth iteration of Apple's mobile operating system brings some new features, and while on the surface it may appear quite similar to iOS 7, under the hood the changes are quite significant. If iOS 7 was the biggest update for users in the seven years since the iPhone and iOS first appeared, then iOS 8 is the biggest update for developers since the launch of iOS 2.0 and the App Store. Read on for our full review.

  • Logitech Targets Home Automation Play with Harmony Living Home Lineup (AnandTech)


    Home Automation and Control - Setting the Stage

    The increasing popularity of home automation (HA) equipment has fueled the Internet of Things (IoT) revolution. However, the low barrier to entry (there are innumerable crowdfunded projects in this space) has resulted in a very fragmented ecosystem. Interoperability is a major concern, and different devices use different protocols. In order to get a seamless experience across all home automation equipment, consumers have been forced to go the custom installation or integrated package route. These avenues tend to keep the joys of home automation and control out of reach of the average consumer.

    The current market situation is ripe for someone to come in with a home automation gateway. Vendors such as Lowe's (with the Iris product line) and Staples (with the Staples Connect initiative) have made interesting forays. However, the primary aim has been to sell more connected peripherals under the same brand. Interoperability with other HA devices is not given any importance.

    On the other side, we have vendors such as Securifi trying to integrate a home automation gateway into a standard wireless router with their Almond+ product. All things considered, it would be best if the wireless router at home were to act as a home automation gateway. Consumers don't need to buy yet another device to act as a gateway purely for their IoT clients. The problems would then be making sure that various HA devices can talk to the gateway and consumers have the ability to interact with all of them using one interface. Unfortunately, these aspects have contributed to Securifi delaying the retail launch of the Almond+. Under these circumstances, the slot is still open for a unified home automation controller. Logitech is hoping to fill that void with today's Harmony Living Home launch.

    Logitech Harmony - A Brief Background

    Logitech's Harmony lineup is very well respected in the universal remote control market. The ability of a single remote / hub device to control multiple home entertainment devices (AVR / TV / media players) coupled with one-touch control and simple setup has been well-received by the consumers. In fact, Harmony's database of over 200K devices (which is also frequently updated) is unparalleled in the industry. The only downside of the units is the pricing aspect.

    Prior to today's launch, the scope of the Harmony lineup didn't go beyond control of entertainment devices in the living room. However, the current popularity of home automation devices and the IoT ecosystem (coupled with the rapid rise of mobile devices that enable easy control via apps) make the next stop for the Harmony lineup quite obvious. Logitech is launching four new product SKUs centered around a home automation gateway hub under the Harmony Living Home category:

    • Logitech Harmony Home Hub
    • Logitech Harmony Home Control
    • Logitech Harmony Ultimate Home
    • Logitech Harmony Hub Extender

    Logitech Harmony Living Home Lineup - Delving Deeper

    The Logitech Harmony Home Hub connects to the home network and uses RF, IR, Bluetooth and Wi-Fi to relay commands from the Harmony mobile app or the Harmony remote to all supported entertainment and automation devices. The Harmony mobile apps can work over the Internet. True remote control of the various devices in one's home from anywhere on the Internet is now possible.

    Logitech Harmony Home Hub and Mobile App

    Consumers can purchase the hub alone for $100 and use the full functionality with just the mobile app. As with any home automation setup, scenes can be programmed involving multiple devices from different vendors. Logitech terms these scenes as experiences.

    The next 'upgrade' in the Living Home lineup is the Logitech Harmony Home Control that costs $150. This kit bundles a button-only remote with the hub described above.

    Logitech Harmony Home Control and Mobile App

    The remote communicates via RF, enabling the hub to be placed in a closed cabinet (if necessary). The mobile apps are obviously compatible with the hub even when the physical remote is being used. This configuration can control any number of home automation devices, but only up to eight entertainment devices.

    The highest end configuration is the Logitech Harmony Ultimate Home. It is quite similar to the Harmony Home Control, except for a few updates to the remote control itself: a 2.4" color touchscreen, gesture control and additional programmability.

    Logitech Harmony Ultimate Home and Mobile App

    The kit including the hub and the touchscreen remote will retail for $350. This configuration can control up to fifteen entertainment devices and a virtually unlimited number of home automation devices.

    In addition to the above three configurations (which will be available for purchase this month), Logitech will also be introducing the Logitech Harmony Hub Extender in December for $130. This extender will expand compatibility by allowing the hub to talk to devices that communicate using ZigBee or Z-Wave. Logitech also stressed the fact that the extender will be Thread-compatible.

    Concluding Remarks

    The Living Home lineup is a welcome addition to the home automation market. However, Logitech faces a few challenges. There are also a few questionable decisions that have been made with respect to the operating details.

    1. Entertainment device manufacturers have typically adopted a hands-off approach after selling their wares to the consumers. As such, they don't have any issues sharing methods to control their equipment with Logitech. On the other hand, many of the IoT / home automation device makers treat their customers as recurring revenue sources by adopting subscription models. Some of them also want to tightly control the customer experience within a walled ecosystem. Under these circumstances, it is not clear how willing they would be to share their APIs with Logitech or work to make their products compatible with the Harmony platform. That said, Logitech says more than 6000 home automation devices are currently compatible with the hub, and the number is expected to keep growing.

    2. Logitech is not adopting a subscription fee model for the Living Home lineup. While this is excellent news for consumers, it would be interesting to see what keeps the cloud servers for the external control aspect running in the future. It might not be a big deal for a company of Logitech's size, but it leads to another aspect - decentralized control.

    3. Based on the initial information provided to us, it looks like the Logitech Living Home lineup requires the hub to be always connected to the Internet for it to control the connected devices. This makes sense for devices that currently offer cloud-based control only. But, we are at a loss to understand why devices that can be controlled via the local network itself (such as, say, the UFO Power Center from Visible Energy and the Ubiquiti mFi mPower strips) need an Internet connection when accessed through the hub while being part of the local network. In our opinion, the control logic (i.e., processing the APIs that talk to the various devices) should be resident on the hub rather than on the cloud.

    4. It is not clear whether it is possible for third-party apps to talk to the hubs. Logitech does have a developer program for device makers to make their products compatible with the Harmony home hub. While Logitech indicated that the products being launched today can talk to the recently launched SmartThings and PEQ hubs, the availability of APIs for the Logitech hub itself remains an open question.

    In conclusion, the launch of the Harmony Living Home lineup looks to be just what the home automation market needs. If Logitech can replicate their success with home entertainment control in this space, it solves a very important problem for the consumers and will allow consumers to invest in home automation without the risk of a fragmented experience. A reputable and reliable company had to get serious about this space, and we believe Logitech has the right play here.

  • The New Motorola Moto X (2nd Gen) Review (AnandTech)

    While I talked about Motorola’s issues in the launch article for the new Moto X, it’s well worth repeating. Motorola has been through a lot these past few years. Once the iconic symbol of Android with their Droid smartphones, Motorola had lost its way. It wasn’t unusual to see one phone launch after the other, with no real regard for strategy, and no real cohesive message to tie all of their devices together. If anything, there was a point where Motorola had become an ODM for network operators in the US, with no real international presence. After Google acquired it in 2012, we saw the launch of the Moto X in 2013. The amount of hype that I saw online before the announcement of the Moto X was unlike anything I’ve ever seen.

    Unfortunately, the device that launched didn’t quite fit with the hype. The Snapdragon S4 Pro chipset was decidedly mid-range by the time it launched. The display was good for the time, but AMOLED wasn’t quite the imminent LCD replacement that it is today. The camera was also rather unfortunate at launch. For better or worse, the Moto X was a phone with the right size and shape, but a lot of hardware choices that aged poorly. This leads us to the new Moto X. On the surface, this phone corrects a lot of issues that were present in the original Moto X. The new Moto X brings an SoC that is up to par with its competition, a new camera with a Sony sensor, and an improved AMOLED panel. To find out how it performs, read on for the full review.

  • USB Power Delivery v2.0 Specification Finalized - USB Gains Alternate Modes (AnandTech)

    The last while has been a busy time for the USB 3.0 Promoters Group, with the new USB 3.1 Type-C Connector detailed last month. Joshua was able to get a hands on with the new connector at IDF last week. With support for up to 10 Gbps, a new reversible Type-C connector, and up to 100 watts of power delivery, the USB group is trying to expand the already universal connector to be able to do much more than is possible with the current specification. To fulfill this mandate, they have now finalized the USB Power Delivery v2.0 spec, and the Billboard Device Class v1.0 spec.

    When USB was first introduced, the thought was that it would be primarily a data interface, with a limited amount of power delivery which was generally used to power the electronics of certain devices. The initial specification for USB only had provisions for 0.75 watts of power – 150 mA at 5 V. USB 2.0 bumped that to 500 mA, or 2.5 watts, and USB 3.0 specified 900 mA at 5 V, or 4.5 watts. All of these specifications allow for power as well as data transmission at the same time. In addition, there was also a Battery Charging specification which allows up to 1.5 A at 5 V for a maximum of 7.5 watts of power but with no data transmission available. The jump from 7.5 watts to 100 watts of the new specification is a huge increase, and one that cannot be done with just an amperage increase on the system as was done in the previous versions of USB. Version 3.1 now supports 5 V, 12 V, and 20 V on the pins to allow the higher power output without excessive current, but even the current has been increased to a maximum of 5 A which is much higher than before.
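
    Written out as simple volts-times-amps arithmetic, using the figures above, the progression looks like this:

      #include <cstdio>

      int main() {
          // Voltage / current pairs quoted above for each level of the spec.
          struct Level { const char* spec; double volts, amps; } levels[] = {
              { "USB 1.x",            5.0, 0.15 },   // 0.75 W
              { "USB 2.0",            5.0, 0.5  },   // 2.5 W
              { "USB 3.0",            5.0, 0.9  },   // 4.5 W
              { "Battery Charging",   5.0, 1.5  },   // 7.5 W, no data transmission
              { "USB PD 2.0 (max)",  20.0, 5.0  },   // 100 W
          };
          for (const Level& l : levels)
              std::printf("%-18s %4.1f V x %.2f A = %6.2f W\n",
                          l.spec, l.volts, l.amps, l.volts * l.amps);
          return 0;
      }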

    The inelegant USB 3.0 Micro-B connector

    USB Power Delivery is designed to increase the flexibility of USB, by providing enough power for many more devices, while at the same time still allowing data delivery. It is also even more flexible, due to a couple of changes. First, the direction of power delivery is no longer fixed. Imagine a tablet with a keyboard attached. The keyboard can have a battery, and the battery can be charged through the data connection, but when the tablet is unplugged from its charger, the power flow can reverse and the tablet can now be powered by the keyboard. Another example is a laptop with six USB ports. The USB ports can be used for peripherals, or, a USB charger can be connected to any port to charge the laptop. Dedicated charging connectors will no longer be required.

    The reversible USB 3.1 Type-C connector

    Another change is that all devices must now negotiate the amount of power required, and that can be renegotiated if another device requires additional power. A good scenario would be if you have a laptop, and you are charging your phone on one of the USB ports. The phone would be pulling the maximum amount of power it can in order to charge quickly. If you then plug in a USB RAID array, it will need additional power at the start in order to get all of the disks spinning, but that can then be lowered to a steady state. The system can lower the power delivery to the phone, provide it to the RAID array, and then move it back to the phone when the power is available.
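
    Here is a toy model of that renegotiation: a source with a fixed budget trims the phone's grant while the RAID array spins up, then restores it afterwards. It is purely illustrative; the real protocol negotiates discrete voltage and current objects over the cable's configuration channel rather than arbitrary watt values.

      #include <algorithm>
      #include <cstdio>
      #include <map>
      #include <string>

      // Toy power-budget model of a PD source: sinks request watts, the source grants
      // what fits in its budget and can claw back headroom from earlier grants.
      class ToySource {
      public:
          explicit ToySource(double budgetW) : budget_(budgetW) {}

          void Request(const std::string& sink, double wantedW) {
              grants_.erase(sink);
              double granted = std::min(wantedW, Free());
              if (granted < wantedW) {                  // trim other sinks if needed
                  for (auto& g : grants_) {
                      double take = std::min(g.second, wantedW - granted);
                      g.second -= take;
                      granted  += take;
                      if (granted >= wantedW) break;
                  }
              }
              grants_[sink] = granted;
              std::printf("%-6s granted %5.1f W (headroom left %5.1f W)\n",
                          sink.c_str(), granted, Free());
          }

      private:
          double Free() const {
              double used = 0;
              for (const auto& g : grants_) used += g.second;
              return budget_ - used;
          }
          double budget_;
          std::map<std::string, double> grants_;
      };

      int main() {
          ToySource laptop(60.0);          // e.g. a laptop able to share 60 W across its ports
          laptop.Request("phone", 18.0);   // phone fast-charging
          laptop.Request("raid", 50.0);    // RAID array spinning up: the phone gets trimmed
          laptop.Request("raid", 25.0);    // steady state: headroom comes back
          laptop.Request("phone", 18.0);   // phone returns to its full rate
          return 0;
      }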

    The final key is that the Power Delivery specification is not just for power, nor is it just for USB. The Power Delivery Specification allows Alternate Modes to be defined, and the system can negotiate to enable these modes, enter them, and exit them. These modes will be defined outside the scope of USB-IF specifications using Structured Vendor Defined Messages. This allows the ability to reconfigure some of the pins a USB Type-C connector exposes and will allow the cable to be used for many different purposes rather than just for USB.

    This leads us to the second specification – the Billboard Device Class. This specification outlines the methods used to communicate the Alternate Modes supported by the Device Container to a host system. It includes descriptors which can be used to provide support details in a human-readable format. What it does not contain is the methodology to switch to the Alternate Mode – that is done in the Power Delivery specification itself. The Billboard Device Class will allow a device which supports an Alternate Mode to connect to a host which does not support that mode, and then inform the user why it does not work without having silent failures, and for this reason all Billboard Devices must support USB 2.0 as a minimum.

    This new framework could open the ubiquitous USB cable up to an entirely new array of devices and functions. One possibility that the USB-IF mentions in the specification is a theoretical means for PCI-E over USB. I’ve already given the example of a tablet with a battery in the keyboard, but a laptop could be connected to a monitor which can also charge the laptop. Much more power hungry devices can be connected to a USB port as well, including printers and high speed storage. All of this was defined with a graceful fail as well, so the user is not stuck wondering why his device is not functioning any longer.

    The new Alternate Modes have a lot of potential, and with the increased power capabilities of USB 3.1 and the Power Delivery specification, it will be very interesting to see how these capabilities are taken advantage of in future devices.

  • LG will be the sole supplier of AMOLED displays for the Apple Watch (MacBidouille)

    Nothing was known about the Apple Watch's display, and one could have assumed the company had stayed faithful to LCD.
    According to Digitimes, the watch will get an AMOLED display (less power-hungry than LCD). Moreover, LG will be its sole supplier to Apple, which once again seems to have passed over Samsung (because it could), the other maker of this type of panel.
    Still according to Digitimes, Apple hopes to sell 50 million units over the course of 2015, which is far more than the combined total of the other connected watches already on sale, even counting through the end of 2015.

  • Apple is a member of the GlobalPlatform organization (MacBidouille)

    We hand the floor over to Nicolas.

    Hello,

    here is a piece of news that may have gone unnoticed:
    Apple has become a member of the GlobalPlatform organization.

    GlobalPlatform is a non-profit association that promotes specifications for managing, in an interoperable way, the deployment and maintenance of applications in 'secure areas'.

    This is notably the case with the Secure Element contained in the iPhone 6. It is what hosts the Visa, MasterCard and American Express payment applications, and relying on GlobalPlatform specifications (already used by many smart card and SIM/USIM manufacturers, among others) allows Apple to manage them in a consistent way.

    By taking part in GlobalPlatform, Apple now has the ability to weigh in on the specifications so that they include the features it cares about.

    The TEE (Trusted Execution Environment: the secure environment in the A8, for example) specifications are certainly also of interest to Apple, since they define how to integrate this secure area and have it talk to non-secure areas (iOS; I am not saying iOS is a sieve, just that it does not run inside a TEE).

    Here are some sources to learn more:

    There is also this interesting article about Apple Pay: http://bankinnovation.net/2014/09/heres-how-the-security-behind-apple-pay-will-really-work/ but I am not certain that everything in it is absolutely correct (regarding the cryptogram delivered with the token).

    So let's hope that Apple joining GlobalPlatform is a sign of openness!

  • TODO: a new alliance to make Open Source easier (Génération NT: logiciels)
    Dropbox, Facebook, GitHub, Google and Twitter are among the founding members of the TODO group, a collaboration around Open Source.