Wednesday, February 22, 2012

Google to sell Android-based heads-up display glasses this year


Google heads-up display
It's not the first time that rumors have surfaced of Google working on some heads-up display glasses (9 to 5 Google first raised the possibility late last year), but The New York Times is now reporting that the company is not only working on them, but that it's set to release them by the end of this year. Citing "several Google employees familiar with the project," the paper's Nick Bilton reports that the glasses will be based on Android, pack 3G or 4G connectivity, plus GPS and a range of sensors, and cost "around the price of current smartphones," or somewhere between $250 and $600. They're also said to include a low-resolution camera that can monitor your surroundings in real time and overlay relevant information, although Google is said to be paying attention to potential privacy concerns, and "wants to ensure that people know if they are being recorded by someone wearing a pair of glasses with a built-in camera."

What's more, the Times says that none other than Sergey Brin is a "key leader" on the project, with another being Google engineer Steve Lee, the creator of Latitude. Notably, Bilton also says that Google sees the project as an "experiment that anyone will be able to join," and that the company is not currently thinking about potential business models for the glasses, which could suggest that they may be more of a small-scale hobby than part of a major push into consumer hardware.

Saturday, August 7, 2010

MSI Big Bang Fuzion Motherboard: Review

For: Supports up to 3-way cross-vendor multi-GPU configurations; the automatic OC switch and OC device work very well; excellent performance.


Against: No USB 3.0 ports; no SATA 6 Gb/s support; adding a second graphics card makes the SATA ports on the discrete controllers inaccessible.
SPECIFICATION
Price: Rs 25,000
Website: www.msi.com
Chipset: Intel P55
Memory: DDR3-1333 (16 GB max)
SATA: 6 ports via Intel P55, 4 ports via two JMicron JMB322 controllers, 2x eSATA
USB: 14 (4 via on-board headers)
Expansion slots: 3x PCIe x16, 2x PCIe x1, 2x PCI
LAN: Dual gigabit

MSI has come out with some good high-end boards in the past, such as the Intel X58-based Eclipse Plus, which was SLI-ready and came with a discrete X-Fi sound card, and the quad-CrossFireX-ready 790FX-GD70. MSI's current focus is its Big Bang series of super-high-end motherboards targeted at enthusiasts and gamers. The latest addition to the series is the Big Bang Fuzion, which has loads of interesting features to offer, the most interesting being cross-vendor multi-GPU support courtesy of the Lucid Hydra 200 chip.

Features
The motherboard is built around the Intel P55 chipset and supports Intel Core i3, Core i5 and Core i7 CPUs in the socket LGA 1156 package. The chipset supports up to 16 GB of DDR3 memory and offers six SATA 3 Gb/s ports and 12 USB 2.0 ports. In addition, there are two pairs of SATA 3 Gb/s ports, each supported by a JMicron JMB322 controller; we would have preferred SATA 6 Gb/s support instead. We were also surprised to find USB 3.0 ports missing from the feature set, as they are nowadays common on nearly all high-end and many mainstream motherboards.
The rear panel is quite elaborate: there are 10 USB ports (two of them USB/eSATA combo ports), dual gigabit Ethernet ports, PS/2 ports, a FireWire port, and a connector for the bundled overclocking device called the OC Dashboard. The OC Dashboard serves two purposes. Firstly, it displays the initialization status of the subsystems during POST, which helps in diagnosing faulty hardware or inappropriate BIOS settings; the debug codes and detailed instructions for using the OC Dashboard are given in a separate user guide. Secondly, it lets you overclock the CPU and memory and tweak the voltages (CPU, memory and chipset) from within Windows, with the values displayed on the device's OLED screen. You can also use CPU-Z to verify the effective CPU and memory speeds. A small array of connectors and four DIP switches sit near the RAM slots: the connectors let you check voltages with a multimeter, while the DIP switches boost the voltages (CPU, CPU_VTT, memory and chipset) and widen the voltage adjustment range in the BIOS.
MSI has also bundled a utility called MSI Control Center that displays system information and lets you overclock by simply dragging sliders for various parameters. Additionally, there are three overclocking presets: cooling, cinema and gaming. To make overclocking easy, the board features an automatic overclocking mechanism called OC Genie, which comprises a switch and an OC processor. On activating the switch, the OC processor calculates the optimum overclocked speeds and voltages and applies the values in the BIOS. OC Genie overclocked our Intel Core i7-870 from 2.93 GHz to 4.0 GHz without any issues. The package includes separate user guides for MSI Control Center, OC Genie and overclocking.

Nvidia GeForce GTX 295 Graphics Card: The Fastest 3D Card on the Market


The good: Best single-card 3D performance available; more power efficient than its competition; PhysX support adds some bells and whistles to a few games; DVI and HDMI output.
The bad: Still a power hog, despite its relative efficiency.
The bottom line: Nvidia's GeForce GTX 295 is the single fastest 3D card on the market, and for a relatively aggressive price. Added bonuses like power efficiency and PhysX support sweeten the deal, but even without those extra benefits, we'd still recommend this card for its processing power and comparative value.
Review:
ATI has given Nvidia some staunch competition on the 3D card front for the past six months or so, but with the dual-chip GeForce GTX 295, Nvidia has raced back to the top of the performance pile. At $500 for a boxed version (from Nvidia's board partners), the GTX 295 is aimed at serious PC gamers, but it's also the best value among high-end boards, outperforming the best chips from ATI. The card's significant power demands mean it requires a beefy PC, but anyone with the financial and electrical wherewithal to put the GTX 295 to work will enjoy the best 3D hardware currently on offer.
Like its primary competition, the ATI Radeon HD 4870 X2, the GeForce GTX 295 uses the familiar two-chip, one-card model we've seen from both Nvidia and ATI in the past. The Radeon HD 4870 X2 has been a popular component in a few recent high-end gaming PCs, and with support for multiple graphics chips and graphics cards so prevalent in PCs these days, these dual-chip cards provide gamers with a relatively easy way to set up a quad-GPU configuration.
The popularity of ATI's card had to do with the fact that it outperformed Nvidia's previous high-end behemoth, the $600 single chip GeForce GTX 280, for roughly $100 to $150 less. The GeForce GTX 295 closes both of those gaps, and also offers some noticeable power consumption savings.
AMD's aggressive pricing of its high-end Radeon cards surely contributed to Nvidia bringing the GeForce GTX 295 in for under $600. Nvidia suggested $500 as the starting price for this card, and retailers seem to be following that line so far. This is roughly the same as the price for stock Radeon HD 4870 X2 cards.
                         Nvidia GeForce GTX 295   Asus EAH4870X2
Price                    $500                     $479
Manufacturing process    55nm                     55nm
Core clock               576MHz                   750MHz
Stream processors        240 (x2)                 800 (x2)
Stream processor clock   1.24GHz                  N/A
Memory                   1,792MB                  2GB
Memory speed             2.0GHz GDDR3             3.66GHz GDDR5
Comparing the speeds and specs above might at first glance seem to give the Radeon the engineering advantage over the Nvidia card. Nvidia uses slower, older RAM, and less of it, and both its core clock speed and the number of stream processors (the processing pipelines on the chip that handle various kinds of data requests simultaneously) are lower as well. We suspect Nvidia has two less obvious advantages at work that help its performance.
One is its manufacturing process. The GTX 295 uses two 55-nanometer GTX 200 graphics chips, and cramming two of its older 65-nanometer GTX 200 chips onto one card would have been a power consumption nightmare. We also have no information from ATI on the speed of its stream processors. Our suspicion is that they're significantly slower than the 1.24GHz stream clock on each chip in the GTX 295.
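To put the raw spec numbers in perspective, here is a back-of-the-envelope peak-throughput sketch. The per-shader FLOP rates are our assumptions, not figures from either vendor's spec sheet for these boards: GT200 shaders are commonly rated at 3 FLOPs per clock (dual-issue MAD plus MUL), while ATI's RV770 shaders run at the core clock and are typically rated at 2 FLOPs per clock (MAD).

```python
# Rough theoretical peak single-precision throughput, using the clocks and
# shader counts from the spec table above. FLOPs-per-clock values are
# architectural assumptions, not measured figures.

def peak_gflops(shaders, chips, clock_ghz, flops_per_clock):
    """Theoretical peak throughput in GFLOPS for a multi-chip card."""
    return shaders * chips * clock_ghz * flops_per_clock

# GTX 295: 240 shaders per chip, 2 chips, 1.24GHz shader clock, ~3 FLOPs/clock
gtx295 = peak_gflops(240, 2, 1.24, 3)

# HD 4870 X2: 800 shaders per chip, 2 chips, 750MHz core clock, ~2 FLOPs/clock
hd4870x2 = peak_gflops(800, 2, 0.75, 2)

print(f"GTX 295 theoretical peak:    {gtx295:.0f} GFLOPS")
print(f"HD 4870 X2 theoretical peak: {hd4870x2:.0f} GFLOPS")
```

Even under these assumptions the Radeon comes out ahead on paper, which underlines the review's point: theoretical shader throughput alone doesn't predict game performance.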
For some background on our 3D card testing methodology, we picked our test resolutions to correspond with the native resolutions of 19-inch, 22-inch, and 24-inch wide-screen LCDs. The only oddball is Crysis, which for some reason will support the 16:9 aspect ratio common to HDTVs, but not 16:10, common to wide-screen PC displays. These being the highest-end 3D cards on the market, we also picked the highest possible image quality settings for each game, with the exception of anti-aliasing. For AA we kept to 8x and avoided chip-specific anti-aliasing settings wherever possible, although the GeForce GTX 295 can hit up to 16x AA, depending on the game. We made a custom time demo for Left 4 Dead (before this week's patch, which unfortunately broke our demo file), but in all other cases we use built-in benchmarks, or in the case of Crysis, the downloadable Assault Harbor time demo included with Mad Boris' Crysis benchmarking tool.
Regardless of the technical explanation, the GeForce GTX 295 was simply faster than the Asus EAH4870X2 on almost all of our 3D gaming tests. The only exception is a one-frame advantage for the Asus card on the 1,440x900 Left 4 Dead test. In fairness, the GTX 295's wins aren't by embarrassing margins either, but the Far Cry 2 scores in particular are large enough to be noticeable. Given the convincing lead by the GTX 295 across multiple game engines, in both DirectX 9 and DirectX 10, and at multiple resolutions, we're comfortable recommending it as the top single card you can buy.
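The size of the Far Cry 2 lead can be worked out directly from the frame rates in the benchmark charts later in the review; a quick sketch:

```python
# Percentage lead of the GTX 295 over the Asus EAH4870X2 in Far Cry 2,
# computed from the review's fps figures at 1,440x900, 1,680x1,050
# and 1,920x1,200 respectively.
farcry2 = {
    "GTX 295":   [87, 76, 62],
    "EAH4870X2": [76, 63, 53],
}

leads = [
    100 * (a - b) / b
    for a, b in zip(farcry2["GTX 295"], farcry2["EAH4870X2"])
]
for res, lead in zip(["1,440x900", "1,680x1,050", "1,920x1,200"], leads):
    print(f"{res}: GTX 295 leads by {lead:.0f}%")
```

Double-digit percentage leads at every resolution are the kind of gap a player will actually feel.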
Power consumption, in watts (lower is better)

                         Load   Idle
Nvidia GeForce GTX 295   418    196
Asus EAH4870X2           447    212
We're also happy to point out that the GTX 295 is relatively power efficient compared with the Asus card. We say relatively because Nvidia's card still consumes more than 400 watts under load. That's more power draw for a 3D card alone than required by most budget desktops. Still, the GeForce card is measurably more efficient than the Radeon. This relative efficiency is another benefit of moving the GTX 200 chip to the 55-nanometer manufacturing process mentioned above. The GTX 295 requires one six-pin and one eight-pin connection to your PC's power supply, and because we'd recommend a 750-watt or better power supply to go with this card, we can't exactly argue that it's the greenest component out there. But if you must spend $500 or so on a top-end 3D card, the GTX 295 is at least greener than its competition.
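The efficiency margin mentioned above follows from the wattage figures in the review's power chart; a minimal sketch of the arithmetic:

```python
# Measured power draw under load and at idle, in watts, as reported in the
# review's power-consumption chart.
readings = {
    "GeForce GTX 295": {"load": 418, "idle": 196},
    "Asus EAH4870X2":  {"load": 447, "idle": 212},
}

load_delta = (readings["Asus EAH4870X2"]["load"]
              - readings["GeForce GTX 295"]["load"])
pct = 100 * load_delta / readings["Asus EAH4870X2"]["load"]
print(f"GTX 295 draws {load_delta} W less under load ({pct:.1f}% lower)")
```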
There is, of course, more to the story of the GTX 295 and graphics cards in general. Nvidia has been particularly vocal about the capabilities of its 3D cards beyond mere triangle processing. For example, Nvidia made a physics driver available earlier this week to allow support for its PhysX accelerated game physics software in the PC version of Mirror's Edge. The GTX 295 was able to handle the added processing work without a hitch, but we can't say we found the added effects that worthwhile. Yes, cloth and particle effects like shattering glass and smoke behaved more realistically, bouncing off surfaces and responding to your actions. But in very few cases do the added effects amount to more than a cosmetic improvement to the game experience, and even then they feel tacked on to the Mirror's Edge world (which already has a modular, impersonal feel).
This is not to say we're against PhysX, accelerated game physics in general, or Nvidia's other efforts to differentiate its hardware beyond simple frame rates. What we'll call the parallel programming effort, as represented by Nvidia's CUDA, Apple's OpenCL, ATI's Stream processing, and Microsoft's forthcoming parallel computing support in Windows 7, via DirectX 11, will likely affect commonly used software in the coming years, and we're excited to see what develops. But while Nvidia and ATI both offer some parallel processing in dribs and drabs now, we have yet to see an implementation of this capability that compels us toward one vendor's hardware over another's.
Finally, home theater enthusiasts (and even some PC LCD owners) will be glad to know that like our engineering sample, all of the retail GeForce GTX 295 boards ship with two DVI outs and an HDMI output. You still need to wire the audio signal from your PC's digital output to the card itself (a hassle ATI avoided by integrating an audio chip into all of its new 3D cards), but once that's done, connecting the GTX 295 to an HDTV or a projector should be simple.
Test bed configuration:
Windows Vista Ultimate SP1 64-bit; 3.2GHz Intel Core i7 965; Intel X58 chipset; 4GB 1,066MHz DDR3 SDRAM; 150GB 10,000rpm Western Digital Raptor hard drive

Crysis (Assault Harbor, DirectX 10, 64-bit, very high, 8x AA)
Frames per second (higher is better)

                         1,400x960   1,680x1,050   1,920x1,080
Nvidia GeForce GTX 295   40          34            29
Asus EAH4870X2           32          28            26
Far Cry 2 (Ranch Medium, DirectX 10, very high)
Frames per second (higher is better)

                         1,440x900   1,680x1,050   1,920x1,200
Nvidia GeForce GTX 295   87          76            62
Asus EAH4870X2           76          63            53
Left 4 Dead (DirectX 9, 8x AA, 16x AF, very high)
Frames per second (higher is better)

                         1,440x900   1,680x1,050   1,920x1,200
Nvidia GeForce GTX 295   161         154           141
Asus EAH4870X2           162         148           133