
IBM and AMD first to announce 22nm SRAM chips


IBM recently announced the production of its first functional 22nm SRAM cell. This is not a finished processor; 22nm production chips are still about three years away. SRAM chips are typically the first semiconductor devices used to test a new manufacturing process.

The cells were built on the conventional six-transistor design on a 300mm wafer. The new process shrinks the SRAM cell to a mere 0.1 sq. µm, compared with the 0.346 sq. µm cells of 45nm processors.
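For a rough sense of scale (simple arithmetic on the figures quoted above, not numbers from IBM's announcement), the cell area shrinks by a factor of about 3.5, close to the roughly 4x you would expect from ideal linear scaling between the two nodes:

$$\frac{0.346\ \mu\text{m}^2}{0.1\ \mu\text{m}^2} \approx 3.5, \qquad \left(\frac{45\ \text{nm}}{22\ \text{nm}}\right)^2 \approx 4.2$$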



Intel first demonstrated its 32nm SRAM cell in September last year and seems to be on track with production of its 32nm microprocessor, codenamed Westmere. Meanwhile, AMD still has a lot of catching up to do in launching its 45nm processors, while Intel is all set to launch the second iteration of its 45nm lineup, the Core i7 (Nehalem).

In spite of the hype, the Phenoms failed to perform well compared to the Core 2s, and now AMD's 45nm processors will attempt to take on Intel's Core i7.

IBM also mentioned that it will be using a 32nm high-k metal gate technology of its own design. Intel has been using high-k metal gates on its 45nm Penryns since last year, so it's interesting to see what IBM has done differently.

While it will be some time before the 32nm and 22nm processors actually see the light of day, it's good to know that IBM is already one step ahead, and maybe this time Intel will have to rethink its strategy if it wants to stay ahead of the game.

Yahoo launches iPhone-optimized search site


Earlier this week, Yahoo rolled out a new mobile search site designed specifically for use on Apple's iPhone and iPod Touch devices. Some background information was provided by Yahoo developer Ryan Grove on his blog.

Grove explained that he wanted to make as few concessions and compromises as possible in bringing Yahoo search to the iPhone. So the new mobile site comes with some goodies, too: SearchAssist to auto-complete queries, compatibility with Yahoo's SearchMonkey widgetized results (if you're logged in), and search shortcuts for weather, movie times, and local information.

Early reviews seem to be positive. "Thanks for finally giving me the reason to get one of those damn iPhones," a commenter on Grove's blog wrote.

Hitachi announces a new storage services platform


Hitachi Data Systems on Friday introduced the Universal Storage Platform V (USP V), an all-new storage services platform with a maximum performance of 3.5 million input/output operations per second (IOPS). The platform increases virtualized storage port performance for external storage by up to 500 percent over its predecessor.

“The USP V takes performance to another level,” said Tony Asaro, Senior Analyst, Enterprise Strategy Group. “This is also innovation at its best—improving the architecture of a storage system to raise its performance at nearly all levels—in leaps. Performance is not discussed as often as it should be. If your applications don’t perform well, then your business suffers.”

Hitachi also announced its 4 Gb/s Fibre Channel switch backplane in an enterprise-class storage platform, which provides customers with a fast and cost-effective way to process and transfer data through a storage controller engine.

“Hitachi continues to set the bar for enterprise-class, controller-based, heterogeneous storage virtualization supporting common storage services,” said Carl Greiner, Senior Vice President, Infrastructure and Software, Ovum. “The announcement of the USP V begins to render any controller performance or scalability issues moot. Virtualization-enabled dynamic provisioning allows storage utilization to exceed 85 percent, delivering unique economies to storage infrastructures. This announcement most definitely takes storage virtualization to a new level.”

The Hitachi Universal Storage Platform V facilitates synergistic linkages across a broad spectrum of enterprise, mid-range and low-end storage systems, delivering unified, advanced storage services that span multi-dimensional virtualization, provisioning, partitioning, and replication capabilities.

Intel Nehalem - Not built for gaming


IDF has started, and the first benchmarks of Nehalem are going to start popping up. It is without a doubt an impressive architecture with a much better platform to run on, but this CPU is not about giving you better frames per second in your favorite game than the Penryn family. Let me make that clearer: even when the GPU is not the bottleneck, it is likely that most games will not be significantly faster than on Penryn. We, the people behind it.anandtech.com, will probably have the most fun with it, more than your favorite review crew at Anandtech.com :-). And no, I have not seen any tests before typing this. Nehalem is about improving HPC, database, and virtualization performance, and much less about gaming performance. Maybe this will change once games get some heavy physics threads, but not right away.

Why? Most games are about fast caches and strong integer performance. After all, most of the floating-point action already happens on the GPU. The Core 2 CPUs were a huge step forward in integer performance (not least because of memory disambiguation) compared with the CPUs of the time (P4 and K8). Nehalem is only a small step forward in integer performance, and the gains from slightly higher integer performance are mostly negated by the new cache system. In a previous post I told you that most games really like the huge L2 of the Core family. With Nehalem they get a 32KB L1 with a 4-cycle latency, then a very small (compared with older Intel CPUs) 256KB L2 with a 12-cycle latency, and after that a fairly slow 8MB L3 at 40 cycles. On Penryn, they got a 3-cycle L1 and a 14-cycle 6144KB L2. The Penryn L2 is 24 times larger than Nehalem's!
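To put rough numbers on that argument, here is a back-of-the-envelope average memory access time (AMAT) sketch using the cycle latencies quoted above. The hit rates and the DRAM latency are hypothetical assumptions chosen to resemble a cache-friendly game working set, not measured figures; the point is only to show how the small 256KB L2 pushes more accesses out to the slower L3.

```python
# Back-of-the-envelope AMAT (average memory access time) comparison using the
# cycle latencies quoted above. Hit rates and DRAM latency are assumptions,
# not measurements; they only illustrate the shape of the trade-off.

def amat(levels, dram_latency):
    """levels: list of (hit_rate, latency_in_cycles), ordered from L1 outward."""
    total, reach = 0.0, 1.0           # 'reach' = fraction of accesses that get this far
    for hit_rate, latency in levels:
        total += reach * latency      # every access reaching this level pays its latency
        reach *= (1.0 - hit_rate)     # only the misses continue to the next level
    return total + reach * dram_latency

# Penryn-style hierarchy: 3-cycle L1, 14-cycle 6MB L2
penryn = amat([(0.95, 3), (0.98, 14)], dram_latency=150)
# Nehalem-style hierarchy: 4-cycle L1, 12-cycle 256KB L2, 40-cycle 8MB L3
nehalem = amat([(0.95, 4), (0.80, 12), (0.97, 40)], dram_latency=150)

print(f"Penryn-like AMAT:  ~{penryn:.2f} cycles")   # ~3.9 with these assumptions
print(f"Nehalem-like AMAT: ~{nehalem:.2f} cycles")  # ~5.0 with these assumptions
```

Under those assumed hit rates the Nehalem-style hierarchy pays more per access, which is the gap a big-L2-loving game would feel; in practice Nehalem's integrated memory controller lowers the DRAM penalty, so the real-world difference is smaller than this toy model suggests.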

So is Intel bringing out a quad-core for the IT and HPC crowd? Well, in a way I think the company needs to. While quad-core processors aren’t a fringe product any more, the problem is that we’re living in an era where dual-core is good enough and most software doesn’t deliver when it comes to scaling on a quad arrangement. At the moment no one buys a quad-core to make Microsoft Word run faster. Even gaming, an area that could well benefit from a couple of extra cores, hasn’t been revolutionized by quad-core CPUs.

Core i7

My take on this is that over the next few years we’re going to see more and more software optimized for four or more cores. Games, I think, are going to need to be built from the ground up to do this (or at least the underlying engines will be). This is going to take time, and before game studios put in the effort they’re going to want to see more quad-core parts in the hands of gamers of all levels (casual gamers as well as hardcore enthusiasts).

However, in the short term at any rate, if the Core i7 does end up disappointing gamers, this may well leave a gap in the market that AMD could take advantage of. Rather than concentrating on just the CPU, AMD is doing a pretty good job of marketing platforms, and here it could gain an edge on Intel, especially if it can squeeze more frames per second out of games by streamlining the whole platform.


Wi-Fi in cable set-top boxes coming


Cable TV providers are planning to add wireless to their set-top boxes, according to panelists at a recent Connections event sponsored by Parks Associates and the CEA.

A report in EE Times says cable companies like Cox Communications are planning to deliver signals to the home via coaxial cable, and then add a wireless link for distributing the signal throughout the home. Cox is in the early stages of planning its network architecture, but an official says it is likely to include multimedia over coax as well as Wi-Fi.

Unfortunately, you’ll have to wait about two years for this solution.

Other panelists noted that more TV makers are starting to include Wi-Fi in their sets. HP has sold MediaSmart 802.11g TVs for a couple of years now, and Westinghouse will ship an HDTV this fall that uses ultrawideband chips from Pulse~Link that now support HDMI.

Another company, Celeno, said it has engineered 802.11 chips that can deliver up to four HD video streams over distances of up to 120 feet. The technology should work with any Wi-Fi client and is up for Wi-Fi certification now.

Source: http://blogs.zdnet.com/

BlackBerry Bold debuts in North America


The long-awaited BlackBerry Bold, or BlackBerry 9000, made its North American debut Thursday on Rogers Wireless in Canada.

The Bold is a souped-up version of the BlackBerry Curve model, which has sold well in the U.S. on all four major carriers. This new and improved BlackBerry is supposed to have a superior screen resolution to the Curve and more memory. It also has Wi-Fi and operates over a 3G network, something the GSM version of the Curve does not do. Rogers, like AT&T and T-Mobile, which offer the Curve in the U.S., is a GSM carrier.

But all the new bells and whistles will cost consumers a pretty penny, which might make it out of reach for the consumer customers Research in Motion hopes to attract. Rogers will sell the phone for about $400 with a three-year voice and data plan.

Many have compared the Bold to Apple's iPhone 3G, even going as far as to call it RIM's iPhone killer. But even though the Bold offers an improved full HTML Web browser and an improved screen resolution for watching video, it does not have a touch screen. Like previous BlackBerry models, it has a standard QWERTY keypad and BlackBerry's special track ball for navigation.

That said, the real BlackBerry iPhone killer will likely be the BlackBerry Thunder, which is supposed to have a touch screen. The Thunder is expected to be released later this year.

The Bold has already been available in other countries, including Germany, Austria, Turkey, Chile and Ecuador. But Rogers is the first North American carrier to get the phone. The phone is expected to launch in the U.S. in September, and a CDMA version is also expected.

Source: http://news.cnet.com

IBM, University of Toronto To Build Canada's Most Powerful Supercomputer


In a joint effort, IBM and the University of Toronto's SciNet consortium are creating Canada's most powerful supercomputer, capable of processing up to 360 trillion calculations per second and storing 60 times more data than the Library of Congress Web archive.

The supercomputer is expected to be among the top 20 fastest supercomputers in the world, 30 times faster than the peak performance of Canada's current largest research system, and the second largest system ever built on a university campus, according to IBM. The project will be started immediately, and is slated to be fully operational by summer 2009.

Big Blue said that the supercomputer will pioneer a hybrid design containing two systems that can work together or independently, connected to a massive five petabyte storage complex.

The machine is extremely flexible and capable of running a wide range of software at a high level of performance since it uses IBM's iDataPlex system and IBM's POWER6 architecture, according to the company.

As the largest implementation of IBM's iDataPlex system, the supercomputer will hold twice as many processors per unit as standard systems and is entirely water cooled. More than 4,000 servers will be linked together in the multi-platform solution, including one of the world's largest POWER6 clusters and Intel x86-based clusters. The supercomputer will also be one of the first systems to use future Intel Nehalem processor families, slated for release early next year.

The supercomputer will be used for research in aerospace, astrophysics, bioinformatics, chemical physics, climate change prediction, medical imaging and the global ATLAS project, which is investigating the forces that govern the universe. The supercomputer will also be used to explore why matter has mass and what constitutes the mass of the universe.

An immediate project will be the construction of regional climate change predictions for the province of Ontario and Great Lakes watershed region.

The five-year project is estimated to cost $47 million, including construction and operating costs, Reuters reported. Funding has been provided by the Canadian Foundation for Innovation's National Platforms Fund, in partnership with the Province of Ontario and the University of Toronto.

Intel Releases USB 3.0 Interface Specifications


Intel has released a draft specification revision 0.9 for the Extensible Host Controller Interface (xHCI). The revision, released Wednesday, comes in support of USB 3.0 architecture, also known as SuperSpeed USB. The draft specification provides a standard method for USB 3.0 host controllers to communicate with the USB 3.0 software stack.

One important factor in adopting SuperSpeed USB products is interoperability between multiple devices from different manufacturers. The xHCI draft specification revision 0.9 aims to make interoperability easier to implement, while also making it easier for developers to create software support for the market.

AMD, which along with Nvidia had complained in the past that Intel was not releasing a revision for the USB 3.0 specification, supported the move in a statement that was included in Intel's news release. "Lifestyles filled with HD media and digital audio demand quick and universal data transfer. USB 3.0 is an answer to the future bandwidth need of the PC platform. AMD believes strongly in open industry standards, and therefore is supporting a common xHCI specification," said Phil Eisler, AMD corporate vice president and general manager of the Chipset Business Unit.

Dell, Microsoft and NEC also supported the release.

"Dell welcomes the availability of Intel's xHCI specification because it provides a single interface standard that will expedite the industry transition to next-generation USB 3.0," said Rick Schuckle, Dell client architecture strategist. "This interface standard is important to ensure that our customers have interoperable USB 3.0 systems, devices and software drivers."

"Microsoft has developed driver support for the USB industry standard since its inception and is committed to supporting the latest hardware technologies on the Windows platform," said Chuck Chan, Microsoft general manager of Windows Core OS. "Microsoft intends to deliver Windows support for hardware that is compliant with the xHCI specification; this is a huge step forward in enabling the industry and our customers to easily connect SuperSpeed USB devices to their PCs for exciting new functionality and usages."

Intel already plans to make another revision of the specification -- xHCI 0.95 -- available in the fourth quarter of 2008.

Source: http://www.crn.com

Intel Sticks With 'Core' Competency For Nehalem Branding


Intel made it official Sunday—the upcoming Nehalem microarchitecture "tock" will retain the successful Core brand with the first Nehalem processors for PCs to be called Core i7, including an "Extreme Edition" chip due out before the end of the year, according to Intel.

Intel's metronomic "tick-tock" technology road map includes die shrinks like last year's move from 65nm to 45nm—the "tick"—and major microarchitecture changes, or "tocks," like Nehalem. Intel already stuck with the Core brand once before, when the Santa Clara, Calif.-based chip giant opted to retain the brand name for its Core 2 architecture, the previous "tock" before Nehalem.

Intel Core i7 processors, all of them quad-core devices according to Intel, won't be the only Nehalem products, however.

"This is the first of several new identifiers to come as different products launch over the next year," Intel said in a statement.

Nehalem's changes to the Core 2 microarchitecture include the integration of the memory controller on the die, a key part of Intel rival Advanced Micro Devices' architecture for its PC and server processors. Intel calls its integrated memory controller technology QuickPath.

Intel will also offer Hyper-Threading with Nehalem, giving each core two threads for eight-thread support overall on a quad-core chip, as well as a new cache subsystem.

The first Core i7 product is a quad-core Extreme Edition device, the chip maker confirmed without giving more details beyond stating it would be in production "in the fourth quarter of this year" and will carry a black logo.

But several reports state the product in question is a 3.2GHz processor that Intel will sell for $999. Other Core i7 chips reported to be in production by the fourth quarter are 2.66GHz and 2.93GHz products.

AMD Puts Nvidia On Notice With 4870 X2 Graphics Card


Advanced Micro Devices had been building to this moment since the early summer release of two new 4800 series graphics cards, and on Tuesday unveiled the whopper -- the ATI Radeon HD 4870 X2, which features two GPUs on a single card, boasts two full gigabytes of memory and delivers 2.4 teraflops of processing power, according to AMD.

It's "the world's fastest graphics card," according to AMD, which has priced the initial shipments at $549. The "ultra-enthusiasts" targeted by the 4870 X2 now have a choice between AMD's very best and discrete graphics leader Nvidia's flagship video card -- the 1GB GeForce GTX 280, which could be had Tuesday on Newegg.com at prices in the $450 range.

AMD can now legitimately claim to have an answer to Nvidia's top-shelf products for probably the first time since the Sunnyvale, Calif.-based chip maker acquired ATI Technologies in 2006. If market share follows, it would put a much-needed positive spin on a merger that has thus far garnered far more headlines for the hurt it's put on AMD's financials than for the market's embrace of AMD-ATI products.

AMD is certainly banking on a major win with its 4800 series of products. In addition to the 4870 X2, the $399 ATI Radeon HD 4850 X2 was released Tuesday. In late June, AMD unveiled its first product in the 4800 series, the sub-$200 Radeon HD 4850 and followed up a week or so later with the more powerful 4870, priced at around $300.

The 4800 X2 series, until recently known by the code name R700, builds on the dual-GPU technology AMD developed with its Radeon HD 3800 X2 products, which also come in two flavors, the 3850 X2 and 3870 X2. The two 4800 X2 cards get their cross-GPU performance boosted from technology based on the PCIe Generation 2 standard. AMD continues to lead the industry with its 55nm process technology for graphics chips and remains ahead of the game in supporting DirectX 10.1.

LG Electronics 42LG60 (Scarlet) 42-inch LCD HDTV



Lots of thought has gone into LG's chic "Scarlet" 42-inch HDTV, as evident in everything from the faux-leather remote control to the useful and brilliantly navigable OSD (on-screen display) menu. However, at $2600 (as of July 11, 2008), such niceties don't come cheap.

The entire back of the LCD panel is red, so if you look at the TV from an angle, you see some nice red accents. Personally, though, I see little point to the color, given that you rarely gaze at the back of your big-screen TV. The bezel is thin on the top and sides, but the bottom is about three times thicker; this is where LG hides the unit's superior-sounding, down-firing speakers. As nice as it is not to see speaker grilles, I found the extra-large bottom bezel distractingly unattractive, especially considering that everything else about the display looks stunning.

A number of advanced features come standard. Individual six-color controls are easily found in the Expert Control level of the picture menu. And with a single click of the remote, you can find and adjust many accurately calibrated presets, from Sports mode to Movie mode.

LG also includes an Intelligent Sensor setting in this model. Most sensors just measure the brightness of ambient light in the room. LG's version uses a complex set of algorithms to measure not just brightness but also contrast, color, sharpness, and white balance. The feature worked well most of the time, but on one occasion it briefly garbled some images while constantly trying to adjust the settings to match its changed surroundings.

The set performed well enough to earn a performance score of Good in our PC World Test Center lab tests. The Westinghouse TX-42F430S, however, received the same performance score and costs about $1000 less than the top-shelf Scarlet. Still, LG's high price tag nets you good image quality, great menu options, and extra ports such as USB (and you can use the USB port to play music or view photos from any USB drive).

Should you be lucky enough to add an LG Scarlet to your living room, you won't be disappointed.

Source: http://www.pcworld.com

Olympics Boost Mobile TV


Many Chinese unable to catch the Olympics on television will watch national hurdling hero Liu Xiang retain his 110-meter hurdles crown next week by simply switching on their cellphones.

That, at least, is the dream outcome for the backers of mobile TV, for whom the Games are a golden opportunity to burnish the reputation of a medium that has failed to live up to its potential since it was launched in 2004.

"For certain events, the most important thing is to learn the result instantly," said Yun Weijie, president and chief executive of Telegent Systems, a Silicon Valley semiconductor maker.

"The quality of the images doesn't matter sometimes," he said. "That's exactly the case with mobile TV and the Olympics."

Telegent produces chips that let cellphones receive TV signals free of charge. By the end of 2007, the firm's chips were in use in five million handsets throughout Asia, the Middle East, Europe and Africa, Yun said.

China accounted for half of the total.

"TV will become a standard feature for cellphones in China by the end of this year, just like cameras," Yun said.

Watching TV on a cellphone is already routine in Japan and South Korea, auguring well for the industry's prospects in China.

But mobile TV in China has long been criticized for a lack of eye-grabbing content and bandwidth restrictions that have left viewers frustrated waiting for their screen to light up and -- as Yun admitted -- disappointed by the poor quality of the image.

But things are changing.

In April, China Mobile, the country's largest mobile operator, launched trials of third-generation (3G) mobile services based on TD-SCDMA, a home-grown standard, in eight cities, including Beijing.

Eager to show off 3G, China Mobile has bought some 40,000 mobile TV-enabled handsets and is handing them out to staff and guests for the Games.

ZTE Corp, China's No. 2 telecoms network gear maker, won a contract to provide about 8,000 of the phones.

"The original plan was to distribute the phones after the games," a ZTE official said. "However, China Mobile decided to do it before the games kicked off, because they think the development of mobile TV technology has already reached a satisfactory level."

The official declined to be named as he is not authorized to speak to the media.

Growing Fast

China had over 600 million registered mobile phone users as of June, by far the largest market in the world.

Only 12 million, or two percent, of the users currently subscribe to mobile TV, bringing in 4.6 billion yuan ($670 million) of revenue a year, according to CCID Consulting in Beijing.

But the private consulting firm reckons both subscribers and revenue could grow tenfold by 2012.

Studies by In-Stat China, a high-tech market research firm headquartered in Arizona, show that over 60 percent of existing cellphone users are interested in mobile TV.

"Nowadays, people's attention and time is segmented, so they want multi-functional converged handsets," said Kevin Li, In-Stat China's telecoms research director.

The cost of receiving TV is low, if not free, and television is traditional family entertainment, Li noted. "So copying TV content to mobile phones is attractive to many people," he said.

Among foreign investors seeking to tap the China market is Sheikh Sultan Al-Qasimi, chairman of Gulf Holdings. The United Arab Emirates firm has already invested in mobile TV in six countries, mostly in Southeast Asia.

One of Al-Qasimi's investments, Movaio Pte Ltd in Singapore, has forged a partnership with China Teleformation, a mobile TV content provider in China.

For Al-Qasimi, the attraction of mobile TV in China is that it is largely protected from swings in the economic cycle.

"If you have a boom, people'll have money to spend, so you have customers," he told Reuters on the sidelines of a recent conference in Beijing. "When you have a downturn, you have more customers, because they have nothing to do."

Source: http://www.pcworld.com

HDTV News: Big Screen TV Sales Soon to Slow?



South Korea's LG Electronics Inc said last week the flat screen television market was poised for much slower growth in 2009, but maintained high growth targets for its own plasma and liquid crystal products.

"We definitely expect a slowdown in the growth of the TV market," said Simon Kang, president of LG's display division, which makes plasma displays and flat screen TVs.

Kang also told reporters LG expected significant declines in TV prices in the second half as falling panel prices allow manufacturers to cut costs while competition heats up.

Despite the bleak outlook, Kang said LG aims to raise its North American market share in liquid crystal display (LCD) TVs from 7-8 percent in 2007 to 12-13 percent by the end of 2008, compared to a goal of "more than 10 percent" given earlier this year.

The executive also said LG would not follow bigger rivals Sony Corp and Samsung Electronics Co Ltd into a price war in the key North American market.

"If you compete in the low end you end up damaging your brand and your profitability," Kang said.

Kang said the recent Euro 2008 soccer tournament and the ongoing Beijing Olympic games had "no effect whatsoever" on television sales, contrary to the lofty hopes expressed by the industry earlier this year.

"Because of the economic slowdown, the Chinese market has been very difficult."

LG's display division contributed 29 percent of the company's total sales in the second quarter -- but less than 5 percent of overall operating profit.

The division posted a modest 1 percent operating profit margin in the second quarter, still a vast improvement over the 5 percent loss margin posted in the year-ago period.

Keeping Plasma

Kang maintained his company's steadfast commitment to plasma screens, a technology that has fallen out of favor after bigger and more prolific LCD companies invaded the market for large-size displays.

After a brief recovery earlier this year, triggered by a shortage of LCDs, plasma screens are now hanging on for dear life amid heightened competition in the flat screen market along with the worldwide slowdown.

Kang said his division now aimed to produce only 4.2 million plasma modules for 2008, down from an initial yearly target of 6 million units set at the beginning of the year.

"There is still a lot of room to improve profitability," said Kang. "Our current position is that we are keeping the plasma business."

Microsoft Silverlight 2 Beta 2 arrives


Microsoft has now released Silverlight 2 Beta 2 for Windows and Mac Intel (6.65MB). The Silverlight homepage does not yet have the new download links, but as things change, this post will be updated. Just like previous Silverlight betas, the installation is very quick and the browser may have to be restarted for the changes to take effect. Silverlight 2 Beta 2 is available for (browser versions in beta are not officially supported):

  • Internet Explorer 7 on Windows Vista and XP SP2
  • Internet Explorer 6 on XP SP2 and Windows 2000
  • Firefox 2 on Windows Vista and XP SP2
  • Firefox 1.5 on Vista and XP SP2
  • Safari on Mac OS 10.4.8+ (Intel-based)

There are supposed to be numerous changes and new features in this version; here’s a quick rundown of the major ones:

  • TabControl has been added to the SDK. The TextBox gains text wrapping and scrollbars, and the DataGrid gains autosize, reorder, sort, and more.
  • Many controls have moved from the SDK (the application) to the runtime and others have been brought in line with their Windows Presentation Foundation (WPF) counterparts. Developers targeting both Silverlight and WPF for their applications should enjoy these changes.
  • Templating for controls has been simplified with the introduction of Visual State Manager.
  • Cross-domain support is further improved over Beta 1, the WebClient has been updated to allow uploads, and duplex communications (’push’ from the server to Silverlight) have been added.
  • Deep Zoom has received a major update: the file format is now XML instead of a binary format, so generating your own Deep Zoom images and collections server-side will become easier. There are also some nice new event models around zoom/pan state.

Developers will want to check out the SDK documentation which was uploaded yesterday. For a more detailed list of changes, check out the full changelog posted by Silverlight Product Manager David Pugmire. The final version of Silverlight 2 is still targeted for a “late summer” release.

Update

Scott Guthrie, Corporate Vice President of the Microsoft Developer Division, has finally posted on his blog about Silverlight 2 Beta 2. Also, the following applications have been updated to support Silverlight 2 Beta 2 and have been posted on the Microsoft Download Center:

Microsoft: Vista feature designed to ‘annoy users’


A Microsoft manager has said that one of the security features in Vista was deliberately designed to “annoy users” to put pressure on third-party software makers to make their applications more secure.

David Cross, a product unit manager at Microsoft, was the group program manager in charge of designing User Account Control (UAC), which, when activated, requires people to run Vista in standard user mode rather than having administrator privileges, and offers a prompt if they try to install a program.

“The reason we put UAC into the (Vista) platform was to annoy users–I’m serious,” said Cross, speaking at the RSA Conference here Thursday. “Most users had administrator privileges on previous Windows systems and most applications needed administrator privileges to install or run.”

Cross claimed that annoying users had been part of a Microsoft strategy to force independent software vendors (ISVs) to make their code more secure, as insecure code would trigger a prompt, discouraging users from executing the code.

“We needed to change the ecosystem,” said Cross. “UAC is changing the ISV ecosystem; applications are getting more secure. This was our target–to change the ecosystem. The fact is that there are fewer applications causing prompts. Eighty percent of the prompts were caused by 10 apps, some from ISVs and some from Microsoft. Sixty-six percent of sessions now have no prompts,” said Cross.

Cross claimed it is a myth that users just turn UAC off, saying that Microsoft had collected opt-in information from users that showed that 88 percent were running UAC. Cross said it was also a myth that users blindly accept prompts without reading them.

“It’s a myth that users click ‘yes,’ ‘yes,’ ‘yes,’ ‘yes,’” said Cross. “Seven percent of all prompts are canceled. Users are not just saying ‘yes.’”

Security company Kaspersky has severely criticized UAC, claiming in March last year that it would make Vista less secure than Windows XP.

At this year’s RSA Conference, however, the security specialist seemed to have changed its tune. With Windows, “there is a large attack surface with a number of entry points,” said Jeff Aliber, Kaspersky’s U.S. senior director of product marketing. “Anyone trying to shrink that attack surface and promote secure apps development has to be a good thing.”

Prior to the launch of Vista, Kaspersky issued a report in January 2007 that said UAC would be ineffectual. The company claimed that many applications perform harmless actions that, in a security context, can appear to be malicious. As UAC flashes up a warning every time such an action is performed, Kaspersky said that users would be forced to either blindly ignore the warning and allow the action to be performed or disable the feature to stop themselves from going “crazy.”

Laptop makers adopt 3G Gobi chipset


Dell, Hewlett-Packard, and Lenovo are to incorporate Qualcomm’s Gobi chipset into their laptops later this year.

Gobi, which Qualcomm released in October 2007, is a chipset that allows travelers to connect to both High-Speed Packet Access (HSPA) and Evolution-Data Optimized (EV-DO) networks. Both are types of “super-3G” but are incompatible.

HSPA is used in Europe and much of the rest of the world, while EV-DO is used in North America and parts of Australasia.

The disparity between HSPA and EV-DO networks has led to a situation where, despite data-roaming agreements between companies such as Vodafone (in the U.K.) and Verizon (in the U.S.), a subscriber to either operator is forced to switch data cards when traveling between the regions.

“The Gobi solution enables enterprise users and consumers with the freedom of being untethered from Wi-Fi hot spots and connecting to the Internet using ‘almost anywhere’ cellular broadband connectivity,” Greg Raleigh, vice president of product management for Qualcomm CDMA Technologies, said last week. “We are pleased that Dell will be (using) the flexibility and efficiency Gobi provides to meet the growing needs of mobile data users.”

Ken Bond, Dell’s director of wireless product management, said the move would allow the laptop manufacturer to address the needs of “customers (who) are demanding more freedom to compute the way they want, where they want.”

Secret recipe inside Intel’s latest competitor


It works like an Intel chip, but looks like the Cell processor.

That’s one way of describing the energy-efficient multiple core processors being devised by secretive Montalvo Systems. The Santa Clara, Calif.-based company has come up with a design for a chip for portable computers and devices that–when finished and manufactured–will theoretically be capable of running the same software as chips from Intel or Advanced Micro Devices.

Montalvo’s chips, however, will fundamentally differ from the latest Core or Opteron processors from Intel and AMD in that the cores on its chip won’t be symmetrical, i.e. identical to each other. Instead, Montalvo’s chips will sport a mix of high-performance cores and lower-performance cores on the same piece of silicon, similar to the Cell chip devised by IBM, Toshiba, and Sony, according to sources close to the company.

By merging asymmetrical cores onto the same piece of silicon, Montalvo can cut power consumption by dispatching applications that don’t require a lot of computing firepower onto less powerful, more energy-efficient cores. Applications could conceivably also be shuttled to low-power cores after their need for high performance elapses: Microsoft Outlook, for instance, requires a burst of performance during its launch phase but far less once it’s running.
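To illustrate the general idea only (Montalvo has disclosed nothing about how its chips or software would actually place work, and every name and threshold below is invented for the example), here is a minimal sketch of a demand-based core-assignment heuristic:

```python
# Toy big/little placement policy: run demanding tasks on a fast core and
# background tasks on an efficient core. Purely illustrative; the threshold,
# task names, and the policy itself are hypothetical, not Montalvo's design.

from dataclasses import dataclass

@dataclass
class Task:
    name: str
    recent_cpu_util: float   # observed utilization over the last scheduling window (0.0-1.0)

def pick_core(task: Task, demand_threshold: float = 0.6) -> str:
    """Route compute-hungry work to a high-performance core, the rest to a low-power core."""
    return "high-performance core" if task.recent_cpu_util > demand_threshold else "low-power core"

tasks = [
    Task("outlook-launch", 0.90),   # bursty start-up work
    Task("outlook-idle",   0.05),   # steady-state background activity
    Task("video-playback", 0.70),
]
for t in tasks:
    print(f"{t.name:>15} -> {pick_core(t)}")
```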

Asymmetrical cores can also provide better performance on applications such as video if programmed for that purpose, say proponents of the architecture. The Cell processor became the first chip to successfully champion this idea. The Cell consists of a primary microprocessor core and an array of “synergistic processing elements” that can be programmed to perform discrete tasks like managing networking or video streaming.

Cell chips have primarily been used inside Sony’s PlayStation 3, but IBM has inserted Cell chips in some server blades. Toshiba plans to put the chip inside TVs and may put it inside PCs. (While the initial Cell comes with eight synergistic cores, chips can be made with fewer.) Mercury Computer Systems has also adopted Cell for some computers.

Montalvo has not stated whether it has adopted an asymmetrical core to save power, boost performance on media applications, or both. In fact, the company doesn’t say anything at all. The closest it has come to a public statement are shirts handed out to employees saying that the company can’t say what it is up to. Montalvo declined to comment for this story.

The somewhat different, asymmetrical nature of Montalvo’s chip in part helps explain why investors have put more than $73 million into the Sisyphean task of taking on Intel. Montalvo wants to land its chips in all sorts of portable computers: notebooks, handheld devices such as the OQO, and ornate smartphones. Several companies, however, have tried this and failed because of the daunting nature of competing against Intel. Cyrix, Transmeta, Rise--none of them ever lived up to its advance billing. Only AMD has survived, and AMD has lost more money than it has made in its 30-plus-year existence.

Montalvo is funded by people who’ve tangled or been entangled with Intel before too. NEA-IndoUS’s Vinod Dham, who sits on Montalvo’s board, was one of Intel’s chief chip architects during the Pentium era. He then went to NexGen, which designed an Intel-compatible chip, and then AMD when it bought NexGen.

Montalvo’s CEO is Matt Perry, who also served as chief executive of Transmeta, which once tried to take on Intel in notebooks but now largely concentrates on technology licensing. Peter Song, Montalvo’s chief architect, earlier founded a company called MemoryLogix, which tried to build low-power Intel-compatible chips.

VIA Nano Takes on Intel’s Atom


VIA’s line of CPUs, developed by its Centaur subsidiary, has focused on delivering “enough” performance at very low power. The Centaur line is also very low cost, thanks to tiny die sizes relative to Intel and AMD CPUs.

Rather than take Intel head-on, however, VIA’s approach has been to develop a complete platform, based around its tiny motherboards, which are suitable for low power, embedded applications. Some of the more recent products, like the C7 line, have also garnered design wins in low cost “mini-note” laptops and ultra-mobile PCs.

However, the VIA CPUs were also derided as low performance. “Good enough” was probably good enough for very light-duty office applications, “nettop” PCs, and dedicated, embedded applications where power and form factor were critical.

Glenn Henry, Centaur’s chief architect, decided it was time to flex a little design muscle and develop a processor that offered performance good enough to compete in a more mainstream environment.

The VIA Nano architecture certainly has the right ingredients. The CPU is an out-of-order, superscalar design with a substantially beefed-up floating point unit. It is, however, still a very lean design, eschewing enhancements such as simultaneous multithreading. The Centaur team did build in the hooks to build multicore versions of the Nano, but the first CPUs off the fab lines will be single core processors.

Alas, time and technology march on. Intel has adopted the low-power religion, shipping iterations of its Silverthorne architecture. Now known as Atom, Intel’s new CPU seems like a throwback: an in-order design that sacrifices advanced architectural features in order to minimize power usage. However, Atom does have one ace up its sleeve: simultaneous multithreading (SMT), which Intel calls Hyper-Threading. That turns out to be pretty important, as we’ll shortly see.

Via makes way for 64-bit chips


Via Technologies is making processors based on a new architecture this year that may help the tiny company inch up the chipmaking pecking order.

The chips, which utilize the so-called Isaiah architecture, are expected to provide double the performance of the company’s current chips but consume the same amount of power. They will come with two cores and run at 2 gigahertz.

The first Isaiah chips will make their debut toward the middle of the year. Via announced the architecture in 2004, but it has now released the fuller specifications.

Via occupies only a sliver of the market, but it has managed to land a few interesting design wins with its low-power chips. Hewlett-Packard has used Via chips in some computers sold in China, while Samsung Electronics and Oqo have put Via processors into handheld computers. Many thin-client makers also buy processors from the Taiwanese company.

For Via, the new processors sport a few firsts. For one thing, the chips can process instructions out of order, something chips from Intel and Advanced Micro Devices have done for years. This enables the chip to keep churning while waiting for crucial data.

To date, Via has stuck with in-order execution to keep power consumption low.

“With out-of-order execution, you can do things while waiting. The bad news is that you execute things that later get thrown away” and hence consume more power than necessary, said Glenn Henry, president of Centaur Technology, which is Via’s processor design subsidiary.

The chips will also be capable of processing 64-bit software. AMD has had 64-bit chips since 2003. Intel came out with so-called x86 chips for desktop and notebooks that can process 64-bit software a few years later.

Although 64-bit chips have been out for years, few consumers or even business users actually use 64-bit software on their desktops and notebooks. The several delays to Microsoft’s Windows Vista operating system hurt the evolution of a 64-bit market.

Nanosensors for Medical Monitoring


Physicians often test the levels of a few telltale blood proteins in seriously injured or ill patients to detect organ failure and other problems. Now Vista Therapeutics, a startup based in Santa Fe, NM, hopes to improve the care of these patients with sensitive devices for continuous bedside monitoring of such blood biomarkers. Instead of taking daily snapshots of the patient's levels of blood proteins, the company's nanosensors should allow for continuous monitoring of changes that occur over periods of only a few hours.

Spencer Farr, CEO of Vista Therapeutics, says that the first application of the technology will be for careful monitoring of patients whose status can change rapidly--such as those in the ICU after suffering a heart attack or traumatic injuries from a car accident. "We envision having a branch in the patient's IV that tests continuously or every five to ten minutes," says Farr. The nanowires are sensitive enough that they should be able to detect trace biomarkers that diffuse into the IV line from the blood. After a car wreck, for example, patients could be closely monitored for molecular warning signs of impending kidney and other organ failure.

To make the detectors, Vista Therapeutics has licensed nanowire sensing technologies developed by Harvard University chemist Charles Lieber. Silicon nanowires, semiconducting wires as thin as two nanometers, have what Lieber calls the "ultimate sensitivity," even with completely unprocessed samples such as blood. When a single protein binds to an antibody along the wire, the current flowing through the wire changes. Arrays of hundreds of nanowires, each designed to detect a different molecule in the same sample, can be arranged on tiny, inexpensive chips. The changes can be monitored continuously as molecules bind and unbind, making it possible to detect subtle trends over time, without requiring multiple blood draws.

The standard protein-detection technique, ELISA, is very sensitive but, Farr says, takes 90 minutes to perform. It starts with a blood draw that must be extensively processed--first to purify the proteins, then to label them with fluorescent dyes--and then tested with expensive imaging equipment in a hospital lab. "ELISA is a powerful technology for one-time measurements," says Farr, "but there's no existing technology for continuous biomarker measurement."

The sensitivity of nanowire detectors should also open up the possibility of finding new biomarkers. The blood biomarkers that doctors routinely test for--including prostate-specific antigen for cancer screening and c-reactive protein, a sign of heart failure--can be monitored with ELISA because their levels change over days or weeks. Because nanowire sensors allow for extremely sensitive, continuous monitoring, they should allow doctors to monitor the levels of blood proteins and other molecules whose concentration changes over a much shorter timescale. Changes in these biomarkers are currently undetectable. "We expect we'll be able to include those that change rapidly, peaking within a matter of a few hours," says Farr. Because it hasn't been practical to make such measurements before, it's not clear just what these biomarkers will be, but Farr hopes that Vista will uncover them.

Initially, Vista will market clinical devices for monitoring known biomarkers in IV lines. In the future, the company might develop implantable chips for patients with chronic diseases such as diabetes. A nanowire chip in an artery in the wrist might continuously monitor blood glucose and proteins indicative of early liver damage and other diabetic complications. The device could send alerts to a wristwatch. Because nanowires are so sensitive and inexpensive, they could also find their way into home tests for cancer, where early detection is key, says Farr.

A Cool Fuel Cell


A new electrolyte for solid-oxide fuel cells, made by researchers in Spain, operates at temperatures hundreds of degrees lower than those of conventional electrolytes, which could help make such fuel cells more practical.

Jacobo Santamaria, of the applied-physics department at the Universidad Complutense de Madrid, in Spain, and his colleagues have modified a yttria-stabilized zirconia electrolyte, a common type of electrolyte in solid-oxide fuel cells, so that it works at just above room temperature. Ordinarily, such electrolytes require temperatures of more than 700 °C. Combined with improvements to the fuel-cell electrodes, this could lower the temperature at which these fuel cells operate.

Solid-oxide fuel cells are promising for next-generation power plants because they are more efficient than conventional generators, such as steam turbines, and they can use a greater variety of fuels than other fuel cells. They can generate electricity with gasoline, diesel, natural gas, and hydrogen, among other fuels. But the high temperatures required for efficient operation make solid-oxide fuel cells expensive and limit their applications. The low-temperature electrolyte reported by the Spanish researchers could be a "tremendous improvement" for solid-oxide fuel cells, says Eric Wachsman, director of the Florida Institute for Sustainable Energy, at the University of Florida.

In a solid-oxide fuel cell, oxygen is fed into one electrode, and fuel is fed into the other. The electrolyte allows oxygen ions to migrate from one electrode to the other, where they combine with the fuel; in the simplest case, in which hydrogen is the fuel, this produces water and releases electrons. The electrolyte prevents the electrons from traveling directly back to the oxygen side of the fuel cell, forcing them instead to travel through an external circuit, generating electricity. Via this circuitous route, they eventually find their way to the oxygen electrode, where they combine with oxygen gas to form oxygen ions, perpetuating the cycle.
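For the hydrogen-fueled case described above, the standard solid-oxide electrode reactions are:

$$\text{Cathode (air side):}\quad \tfrac{1}{2}\,\mathrm{O_2} + 2e^- \rightarrow \mathrm{O^{2-}}$$
$$\text{Anode (fuel side):}\quad \mathrm{H_2} + \mathrm{O^{2-}} \rightarrow \mathrm{H_2O} + 2e^-$$
$$\text{Overall:}\quad \mathrm{H_2} + \tfrac{1}{2}\,\mathrm{O_2} \rightarrow \mathrm{H_2O}$$

It is the oxygen-ion migration step that the electrolyte has to support, which is why its ionic conductivity largely dictates the operating temperature.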

The electrolyte--which is a solid material--typically only conducts ions at high temperatures. Santamaria, drawing on earlier work by other researchers, found that the ionic conductivity at low temperatures could be greatly improved by combining layers of the standard electrolyte materials with 10-nanometer-thick layers of strontium titanate. He found that, because of the differences in the crystal structures of the materials, a large number of oxygen vacancies--places within the crystalline structures of the materials that would ordinarily host an oxygen atom--formed where these two materials meet. These vacancies form pathways that allow the oxygen ions to move through the material, improving the conductivity of the materials at room temperature by a factor of 100 million.

The material is still some way from being incorporated into commercial fuel cells. For one thing, the large improvement in ionic conductivity will require further verification, Wachsman says, especially in light of the difficulty of measuring the performance of extremely thin materials. Second, the direction of the improved conductivity--along the plane of the material rather than perpendicular to it--will require a redesign of today's fuel cells. What's more, the limiting factor for the temperature in fuel cells now is the electrode materials. Before room temperature solid-oxide fuel cells are possible, these will also need to be improved.

Yet if initial results are confirmed by future research, the new materials will represent a significant advance. Ivan Schuller, a professor of physics at the University of California, San Diego, says that this represents a major change in the performance of electrolytes. He adds, "It will surely motivate much new work by others."

Drawing Circuits with Nano Pens



The demand for ever faster, cheaper electronics is pushing the lithography-based manufacturing techniques standard in the semiconductor industry to their limits. Now researchers report a cheap, fast lithography technique that uses arrays of flexible polymer nano pens to precisely pattern millions of complex structures in parallel. The technique, which the researchers have used to create an integrated circuit (and lilliputian versions of the Olympics logo), can be employed to draw lines whose widths range from a few nanometers to millimeters.

The technique, developed by Chad Mirkin, a chemist at Northwestern University and director of the International Institute for Nanotechnology, uses arrays of pyramid-shaped polymer pens whose tips are dipped in solutions of chemicals that may feature almost any molecule, including proteins and acids; the pens are then traced over a surface by a mechanical arm to create millions of structures in parallel. The width of the lines drawn by each pen can be carefully controlled by varying the force exerted on the flexible pen tips. Because Mirkin's pens trace out designs programmed by computer software, they can quickly switch between complicated designs, making possible the creation of complex patterns whose features are very close together.

Mirkin has used the pens to pattern acid on a silicon wafer coated with gold; he then etched, based on the pattern, a gold integrated circuit. Polymer-pen lithography also shows promise for patterning biological molecules. Indeed, says Mirkin, the technique could work with almost any molecular "ink," including proteins for capturing and studying cells. The arrays of polymer pens cost less than a dollar each to make.

Polymer-pen lithography is an improvement over dip-pen lithography, a technique that Mirkin has been developing since 1999. Dip-pen lithography uses arrays of sharp, stiff cantilevered probes--the same ones used for atomic force microscopy. Mirkin created a company, NanoInk, to commercialize the technology. But, he acknowledges, "its ultimate utility has been limited by problems with throughput, cost, and complexity." The size of its molecular strokes has been restricted to a relatively narrow range, the cantilevers are prone to breaking, and the number of structures that can be made in parallel is limited.

"If this works," says Grant Willson, an engineer at the University of Texas at Austin, "it will speed the process" of patterning structures with nano pens. The new version of dip-pen lithography could make the technology much more commercially practical. But Mirkin's technique will be competing in a crowded field, notes Willson. Researchers aiming to pack circuits with ever smaller features for ever faster chips are taking many different nanofabrication approaches. Some, for example, are creating optical antennas to focus light into very small beams to extend the capabilities of photolithography. Others have turned to beams of electrons or ions, or use heat deformation to form patterns.

Harald Fuchs, director of the Interface Physics Group at the University of Münster, in Germany, says that the major advantage of Mirkin's technique over other nanofabrication methods is precision and flexibility. The pens could be used to write a pattern in one molecular ink, get dipped in another, and then write another layer. To make even more complex patterns, says Fuchs, each pen tip could be dipped in a different ink.

Mirkin says that Northwestern is talking to companies, including his own NanoInk, about commercializing polymer-pen lithography. The technique, he says, will make the dip-pen technology "accessible to a large number of people."

Source: www.technologyreview.com

Bringing Invisibility Cloaks Closer


In an important step toward the development of practical invisibility cloaks, researchers have engineered two new materials that bend light in entirely new ways. These materials are the first that work in the optical band of the spectrum, which encompasses visible and infrared light; existing cloaking materials only work with microwaves. Such cloaks, long depicted in science fiction, would allow objects, from warplanes to people, to hide in plain sight.

Both materials, described separately in the journals Science and Nature this week, exhibit a property called negative refraction that no natural material possesses. As light passes through the materials, it bends backward. One material works with visible light; the other has been demonstrated with near-infrared light.

The materials, created in the lab of University of California, Berkeley, engineer Xiang Zhang, could show the way toward invisibility cloaks that shield objects from visible light. But Steven Cummer, a Duke University engineer involved in the development of the microwave cloak, cautions that there is a long way to go before the new materials can be used for cloaking. Cloaking materials must guide light in a very precisely controlled way so that it flows around an object, re-forming on the other side with no distortion. The Berkeley materials can bend light in the fundamental way necessary for cloaking, but they will require further engineering to manipulate light so that it is carefully directed.

One of the new Berkeley materials is made up of alternating layers of metal and an insulating material, both of which are punched with a grid of square holes. The total thickness of the device is about 800 nanometers; the holes are even smaller. "These stacked layers form electrical-current loops that respond to the magnetic field of light," enabling its unique bending properties, says Jason Valentine, a graduate student in Zhang's lab. Naturally occurring materials, by contrast, don't interact with the magnetic component of electromagnetic waves. By changing the size of the holes, the researchers can tune the material to different frequencies of light. So far, they've demonstrated negative refraction of near-infrared light using a prism made from the material.

Researchers have been trying to create such materials for nearly 10 years, ever since it occurred to them that negative refraction might actually be possible. Other researchers have only been able to make single layers that are too thin--and much too inefficient--for device applications. The Berkeley material is about 10 times thicker than previous designs, which helps increase how much light it transmits while also making it robust enough to be the basis for real devices. "This is getting close to actual nanoscale devices," Cummer says of the Berkeley prism.

The second material is made up of silver nanowires embedded in porous aluminum oxide. "The nanowire medium works like optical-fiber bundles, so in principle, it's quite different," says Nicholas Fang, mechanical-science and -engineering professor at the University of Illinois at Urbana-Champaign, who was not involved in the research. The layered grid structure not only bends light in the negative direction; it also causes it to travel backward. Light transmitted through the nanowire structure also bends in the negative direction, but without traveling backward. Because the work is still in the early stages, it's unclear which optical metamaterial will work best, and for what applications. "Maybe future solutions will blend these two approaches," says Fang.

Making an invisibility cloak will pose great engineering challenges. For one thing, the researchers will need to scale up the material even to cloak a small object: existing microwave cloaking devices, and theoretical designs for optical cloaks, must be many layers thick in order to guide light around objects without distortion. Making materials for microwave cloaking was easier because these wavelengths can be controlled by relatively large structural features. To guide visible light around an object will require a material whose structure is controlled at the nanoscale, like the ones made at Berkeley.

Developing cloaking devices may take some time. In the short term, the Berkeley materials are likely to be useful in telecommunications and microscopy. Nanoscale waveguides and other devices made from the materials might overcome one of the major challenges of scaling down optical communications to chip level: allowing fine control of parallel streams of information-rich light on the same chip so that they do not interfere with one another. The new materials could also eventually be developed into lenses for light microscopes. So-called superlenses, which get around the fundamental resolution limits of light microscopes, have already been built by Fang and others and have revealed the workings of biological molecules at nanoscale resolution, but they rely on ultraviolet light, which damages living cells in large doses. It hasn't yet been possible to make superlenses that work in the information-rich and cell-friendly visible and near-infrared parts of the spectrum.

First All-Nanowire Sensor

| 0 comments |

Researchers at the University of California, Berkeley, have created the first integrated circuit that uses nanowires as both sensors and electronic components. With a simple printing technique, the group was able to fabricate large arrays of uniform circuits, which could serve as image sensors. "Our goal is to develop all-nanowire sensors" that could be used in a variety of applications, says Ali Javey, an electrical-engineering professor at UC Berkeley, who led the research.

Nanowires make good sensors because their small dimensions enhance their sensitivity. Nanowire-based light sensors, for example, can detect just a few photons. But to be useful in practical devices, the sensors have to be integrated with electronics that can amplify and process such small signals. This has been a problem, because the materials used for sensing and electronics cannot easily be assembled on the same surface. What's more, a reliable way of aligning the tiny nanowires that could be practical on a large scale has been hard to come by.

A printing method developed by the Berkeley group could solve both problems. First, the researchers deposit a polymer on a silicon substrate and use lithography to etch out patterns where the optical sensing nanowires should be. They then print a single layer of cadmium selenide nanowires over the pattern; removing the polymer leaves only the nanowires in the desired location for the circuit. They repeat the process with the second type of nanowires, which have germanium cores and silicon shells and form the basis of the transistors. Finally, they deposit electrodes to complete the circuits.

The printed nanowires are first grown on separate substrates, which the researchers press onto and slide across the silicon. "This type of nanowire transfer is good for aligning the wires," says Deli Wang, a professor of electrical and computer engineering at the University of California, Santa Barbara, who was not involved in the research. Good alignment is necessary for the device to work properly, since the optical signal depends on the polarization of light, which in turn is dependent on the orientation of the nanowires. Similarly, transistors require a high degree of alignment to switch on and off well.

Another potential advantage of the printing method is that the nanowires could be printed not only onto silicon, but also onto paper or plastics, says Javey. He foresees such applications as "sensor tapes"--long rolls of printed sensors used to test air quality or detect minute concentrations of chemicals. "Our next challenge is to develop a wireless component" that would relay the signals from the circuit to a central processing unit, he says.

But for now, the researchers have demonstrated the technique as a way to create an image sensor. They patterned the nanowires onto the substrate to make a 13-by-20 array of circuits, in which each circuit acts as a single pixel. The cadmium selenide nanowires convert incoming photons into electrons, and two different layers of germanium-silicon nanowire transistors amplify the resulting electrical signal by up to five orders of magnitude. "This demonstrates an outstanding application of nanowires in integrated electronics," says Zhong Lin Wang, director of the Center for Nanostructure Characterization at Georgia Tech.

After putting the device under a halogen light and measuring the output current from each circuit, the group found that about 80 percent of the circuits successfully registered the intensity of the light shining on them. Javey attributes the failure of the other 20 percent to such fabrication defects as shorted electrodes and misprints that resulted in poor nanowire alignment. He notes that all of these issues can be resolved with refined manufacturing methods.

The researchers also plan to work toward shrinking the circuit to improve resolution and sensitivity. Eventually, says Javey, they want everything on the circuit to be printable, including the electrodes and contacts, which could help further reduce costs.

Magnets Capture Cancer Cells

| 0 comments |

Magnetic nanoparticles coated with a specialized targeting molecule were able to latch on to cancer cells in mice and drag them out of the body. The results are described in a study published online this month in the Journal of the American Chemical Society. The study's authors, researchers at Georgia Institute of Technology, hope that the new technique will one day provide a way to test for--and potentially even treat--metastatic ovarian cancer.

"It's a fairly novel approach, to use magnetic particles in vivo to try to sequester cancer cells," says Michael King, an associate professor of biomedical engineering at Cornell University, who was not involved in the study.

With ovarian cancer, metastasis occurs when cells slough off the primary tumor and float free in the abdominal cavity. If researchers could use the magnetic nanoparticles to trap drifting cancer cells and pull them out of the abdominal fluid, they could predict and perhaps prevent metastasis. Although the nanoparticles were tested inside the bodies of mice, the authors envision an external device that would remove a patient's abdominal fluid, magnetically filter out the cancer cells, and then return the fluid to the body. After surgery to remove the primary tumor, a patient would undergo the treatment to remove any straggling cancer cells. The researchers are currently developing such a filter and testing it on abdominal fluid from human cancer patients.

"It's possible that the particles may not ever have to go into the patient's body," says John McDonald, chief scientific officer of the Ovarian Cancer Institute at Georgia Tech and a senior author of the paper. "That would be preferable, because then you don't have to worry about any potential toxicity."

The particles, which are just 10 nanometers or less in diameter, have cobalt-spiked magnetite at their core. Most of the time they are not magnetic, but when a magnet is present, they become strongly attracted to it. On the surface of the particles is a peptide--a small, proteinlike molecule--designed to attach to a marker that protrudes from most ovarian cancer cells.

To test the new technology, the researchers injected first cancer cells and then the magnetic nanoparticles into the abdominal cavities of mice. The cancer cells were tagged with a green fluorescent marker, and the nanoparticles with a red one. When the team brought a magnet near each mouse's belly, a concentrated area of green and red glow appeared just under the skin, indicating that the nanoparticles had latched on to the cancer cells and dragged them toward the magnet.

While this experiment showed that the nanoparticles could snag at least some cancer cells within the body, it's not yet clear what proportion of cancer cells were captured and removed. Tests to pinpoint that proportion are planned.

Cornell's King suspects that the technology may be better suited to diagnosing, rather than treating, metastasis. "I think that this technology has much more potential for diagnostics and for detecting cancer cells," he says. "I'm not fully convinced that it could be used to really significantly filter out cancer cells as a therapy."

A similar technology that uses antibody-coated beads to separate out cancer cells has already proved effective in vitro, but the new study's authors believe that the magnetic nanoparticles will be less likely to cause an unwanted immune response and are thus better suited for use inside the body. And because they seem to bind more strongly than antibodies to their targets, says McDonald, they may be better able to pull out cancer cells.

"The ideal would be to try to get everything, but I doubt that would happen," says McDonald. "But we believe that we could significantly reduce the number and thus lower the probability of metastasis."

For now, the treatment seems uniquely suited to ovarian cancer; most other tumors metastasize through cells floating in the bloodstream rather than in the abdominal fluid. But eventually, the team hopes to adapt the particles for use in blood, perhaps extending their use not only to other cancer types, but also to viral diseases such as HIV. To do so, say the researchers, they will need to develop highly specific targeting molecules for each disease to ensure that healthy blood cells are spared.

To test the feasibility of using the nanoparticles in the bloodstream, Ken Scarberry, a graduate student at Georgia Tech and coauthor of the study, reports watching them in action in an artificial circulatory system that passed under a fluorescence microscope. When a magnet was placed near the microscope's lens, "you could see that all of the cells immediately got sequestered over to the side and did not move as the fluid continued to flow," says Scarberry. "This technology has so many possibilities. Right now, I think we're just scratching the surface with it."

Source: www.technologyreview.com

A Secret Tool for the U.S. Swim Team

| 0 comments |

Around the time that the swimwear company Speedo was calling on NASA scientists to help create the now famous LZR Racer suit--an enhanced skin that many people credit for more than a dozen world records broken by swimmers so far this week in Beijing--a scientist in New York began working on a different tool for the swimmer's armory. Over the past five years, Tim Wei, a mechanical and aerospace engineer at Rensselaer Polytechnic Institute, has revamped an established technique in fluid dynamics to study human movement for the first time. The method allows scientists and coaches to study how fast and hard a swimmer pushes the water as she moves through it. Swim coach Sean Hutchison, who put two athletes on the Olympic swim team, says that he used Wei's insights as the basis for every technical change he made with swimmers leading up to the Olympic trials and games this year.

Wei uses a tracking technique called digital particle image velocimetry, commonly used to measure the flow of small particles around an airplane, or around small fish or crustaceans in water. For water-based flow experiments, researchers pour minute silver-coated beads into water and illuminate them with a laser. A high-speed digital video camera tracks the downstream flow of beads over the creature. "But ramping up to large scales is hard," says biologist Frank Fish, who studies the propulsion of aquatic mammals at West Chester University and has collaborated with Wei on dolphin studies. "Shining lasers on swimmers and immersing them in water full of glass beads may be asking them to go above and beyond in the name of science."

Wei devised a novel solution: instead of glass beads, he filtered compressed air in a scuba tank through a porous hose to create bubbles about a tenth of a millimeter in diameter. An athlete swims through a sheet of bubbles that rises from the pool floor, and a camera captures their flow around the swimmer's body. Images show the direction and speed of the bubbles, which Wei then translates into the swimmer's thrust using software that he wrote. "More force equals faster swimming," he says.
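
Wei's software itself isn't described in detail, but the standard core of digital particle image velocimetry is to cross-correlate small windows of two consecutive frames and read the local displacement of the tracers off the correlation peak. The rough Python/NumPy sketch below shows only that step; the window size and the conversion from pixel shifts to physical velocity (frame rate, pixel scale) are placeholders rather than values from Wei's setup.

import numpy as np

def piv_displacements(frame_a, frame_b, win=32):
    # Split two consecutive grayscale frames into win x win interrogation
    # windows and, for each window, find the pixel shift of frame_b relative
    # to frame_a that maximizes their cross-correlation (computed via FFT).
    rows, cols = frame_a.shape[0] // win, frame_a.shape[1] // win
    field = np.zeros((rows, cols, 2))
    for i in range(rows):
        for j in range(cols):
            a = frame_a[i*win:(i+1)*win, j*win:(j+1)*win].astype(float)
            b = frame_b[i*win:(i+1)*win, j*win:(j+1)*win].astype(float)
            a -= a.mean()
            b -= b.mean()
            corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
            peak_y, peak_x = np.unravel_index(np.argmax(corr), corr.shape)
            # Map circular FFT indices to signed shifts in pixels.
            dy = peak_y if peak_y <= win // 2 else peak_y - win
            dx = peak_x if peak_x <= win // 2 else peak_x - win
            field[i, j] = (dy, dx)
    # Dividing these displacements by the time between frames and multiplying
    # by the physical size of a pixel would turn them into velocity vectors.
    return field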

In collaboration with Hutchison, who coaches elite athletes outside Seattle, Wei filmed Olympic gold medalist Megan Jendrick and more junior swimmer Ariana Kukors in a flume swimming breaststroke, which has a froglike kick. Jendrick's velocity vectors signaled a fast speed, and they pointed straight out from the bottom of her feet. This meant that her feet threw water behind her, thrusting her forward, much the way that an ice skater who throws a ball will shoot herself in the opposite direction. By comparison, Kukors, a less experienced elite swimmer, had slower vectors that ran parallel to her feet, which meant that she slid through the water.

"[Hutchison] took that and modified the breaststroke kick of all his elite athletes," says Wei, who presented his work to USA Swimming, the sport's governing body, in 2007. In a sport in which shaving tenths of a second can be cause for celebration, Hutchison reported that by adapting her kick, Kukors dropped several seconds in a breaststroke event, although she just missed the Olympic team. Jendrick and another of Hutchison's swimmers, Margaret Hoelzer, are competing this week at the games, where Jendrick placed fifth in the 100-meter breaststroke and Hoelzer, who won a bronze in the 100-meter backstroke, hopes to win gold in the 200 back. She broke the world record in the event in July.

More recently, Wei has turned his attention to a swimmer's thrust. With funding from USA Swimming, Wei built a force balance, an upside-down triangular frame that acts like a bathroom scale. Swimmers lie outstretched in the water and kick into the frame, and it measures their propulsion over time. The output, which for an elite swimmer like Kukors showed a sinusoidal, repetitive wave, can help coaches determine whether an athlete should try to generate more force with a harder, bigger kick rather than a shallower, quicker one. "It depends on the individual swimmer," says Wei, who hopes to combine flow and thrust measuring tools into one image. He also wants to make more measurements of athletes swimming freely, rather than pushing against a wall or in a flume.

Wei will meet with USA Swimming biomechanics coordinator Russell Mark in the fall to talk about what to do next. "Russell's job is providing coaches with a sound physics base for whatever they tell swimmers to do," Wei says. USA Swimming also relies on computer-based flow analysis using whole-body scans of swimmers; the two approaches could eventually be combined, with each used to validate the other.

Source: www.technologyreview.com


Get a faster "3D fix" with GPS satellites

| 0 comments |

Originally developed in 1973 by the US Department of Defense for military purposes, the Navstar GPS network consists of 24 satellites orbiting the earth every 12 hours and five ground stations that monitor the satellites' position in space and operational status. In order to accurately determine your location and other data such as current and average speed, directional heading, and elevation, GPS devices use a receiver to acquire signals from at least four of these satellites. This is known as a 3D fix, and it's why GPS antennas require an unobstructed view of the sky to work correctly.

Armed with your precise latitude, longitude, and other location data, the GPS receiver can overlay this information onto map files stored on the unit, revealing your current position on the map as well as where you've been. Since the receiver is constantly recalculating your position relative to the satellites' positions, the GPS unit can track your location in real time. A typical GPS device contains a 12-channel receiver and an antenna to capture satellite signals, as well as a CPU to process the data. The quality of the receiver and your geographic location will determine how long it takes the device to acquire a 3D fix. For example, it's harder for the receiver to lock onto and hold a signal if you're travelling through a dense forest or an urban area with tall buildings.
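
As a toy illustration of the arithmetic behind a 3D fix: the receiver effectively solves for three position coordinates plus its own clock error using the measured distances (pseudoranges) to at least four satellites, which is why four signals are the minimum. The Python sketch below runs a few Gauss-Newton iterations on made-up satellite positions and ranges; it is not how a real receiver works internally, just the underlying geometry.

import numpy as np

# Made-up satellite positions (km) and pseudoranges (km), for illustration only.
sats = np.array([
    [15600.0,  7540.0, 20140.0],
    [18760.0,  2750.0, 18610.0],
    [17610.0, 14630.0, 13480.0],
    [19170.0,   610.0, 18390.0],
])
pseudoranges = np.array([21110.0, 22080.0, 21710.0, 22210.0])

def solve_fix(sats, pr, iters=10):
    # Unknowns: receiver x, y, z and clock bias (expressed as a distance).
    est = np.zeros(4)
    for _ in range(iters):
        diffs = est[:3] - sats                 # satellite-to-estimate vectors
        dists = np.linalg.norm(diffs, axis=1)
        residuals = pr - (dists + est[3])      # measured minus predicted pseudorange
        J = np.hstack([diffs / dists[:, None], np.ones((len(sats), 1))])
        est += np.linalg.lstsq(J, residuals, rcond=None)[0]
    return est

x, y, z, clock_bias = solve_fix(sats, pseudoranges)
print(x, y, z, clock_bias)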

The first time you fire up your GPS, it collects certain satellite information to determine your whereabouts. In this state, known as a cold start, the receiver is essentially blank and needs to know what time it is, where the satellites are in their orbital patterns, and where it is in relation to the satellites. Most systems take around 30 to 45 seconds to acquire a 3D fix during a cold start, while others can take several minutes. Thereafter, it can take as little as three to four seconds to lock in, since the unit already has your coordinates and a general location of the satellites. A good receiver will instantly recover from a complete signal loss when you drive through a tunnel, for instance, while weaker units will require more time to reacquire a 3D fix. In some cases, you'll have to stop the car to give the receiver a chance to lock on to the requisite signals.

How well a GPS unit works in your car depends on the location of the antenna. If your vehicle has a factory installed in-dash unit, chances are the antenna is integrated into the dashboard in a place where it has an unobstructed view of the sky, which is ideal. Many portable models are designed to be positioned directly on the windshield via a suction-cup-mounting device, giving the antenna a wide sky view. There are also add-on antennas available for GPS units that allow you to keep the receiver close to the front seat for easy viewing without sacrificing signal quality.

GPS headsets make sure the cows come home

| 0 comments |


From the plains of southern New Mexico, we bring you a story of headset-wearing cows. The U.S. Department of Agriculture and the Massachusetts Institute of Technology are teaming up to remotely corral cattle using a wireless device that sends sound straight into the bovines' ears. HDTV-watching pigs can't be far behind.

The solar-powered "Ear-A-Round" is a naugahyde "helmet" held in place by the cow's ears. Atop the holster sits an electronics device hooked to sound-transmitting stereo earphones and containing a GPS unit that could let farmers monitor the animals' whereabouts from afar.

"It's a marriage between biology and electronics," said USDA research animal scientist Dean M. Anderson, who has been collaborating with MIT on the project for the last several years, but has focused on the concept of "directional virtual fencing (PDF)" for more than three decades.

"When I started, the letters 'GPS' meant nothing to me," Anderson said. "But...animal distribution on the landscape has been an age-old challenge. With free-ranging animals, you get areas on the landscape that are overused and other areas that are underused."

The patent-pending device is scheduled to be tested on about nine cows later this month at the USDA's Jornada Experimental Range in Las Cruces, N.M. Anderson noted that not all cattle in a given location will need to be fitted with the instrument--only herd leaders. The animals that will participate in the early testing are currently undergoing a sort of "IQ test for cows" that will identify herd leaders in that group, the researcher said.

The animals will be hearing a range of noises in their ears as they graze--from recordings of Anderson's voice offering such encouragement as, "C'mon girls, let's go," to the sounds of all-terrain vehicles sometimes used in lieu of horses to gather animals.

"Using familiar sounds...this is the key," Anderson told . "Animals can be trained. It doesn't have to be the voice of an individual. It could be things as strange as train whistles or other types of audio cues."

If the sound cues don't work, the device can emit a small electrical shock to move cows in the desired direction.

Intel 'Turbo Memory' tries to speed up Windows

| 0 comments |


Intel's newest version of Turbo Memory aims to do what Windows doesn't: transparently optimize the operating system for flash memory storage.

At the Flash Memory Summit in Santa Clara, Calif., Intel will be demonstrating its latest version of Turbo Memory based on flash memory to accelerate application performance in Windows.

Intel is offering a "dashboard" for Windows that allows the user to choose and control which applications or files are loaded into the Intel Turbo Memory cache (based on flash memory chips) for performance acceleration. Intel calls this "User pinning."

Custom pinning profiles can be created to pin applications or files that match the user's activity, according to Intel. Data-intensive programs, gaming, digital media editing, and productivity software are the kinds of applications Intel expects to benefit most.

Intel is trying to address a longstanding shortcoming of Windows: its inability to take full advantage of flash storage devices. "There are issues related to taking full advantage of the speed of a (flash drive)," said Troy Winslow, marketing manager for the NAND Products Group at Intel, in an interview at the Flash Memory Summit.

Avi Cohen, managing partner at Avian Securities, said he believes this should be an innate part of the operating system. "The more interesting way is to have it built into the operating system," said Cohen. "I don't think it gains much traction because I don't think users want to sit there and start selecting what goes where," he said. "It was a valiant effort by Intel to accelerate the move toward solid state on PC," Cohen added.

Winslow, however, said that Intel "has shipped millions of units" of Turbo Memory and that he expects some notebook makers to integrate it into high-end lines.

Interestingly, Windows Vista does have a feature called "ReadyBoost" that can "use storage space on some removable media devices, such as USB flash drives, to speed up your computer," according to Microsoft documentation. This documentation can also be found in "Windows Help and Support" as part of any copy of Vista.

"When you insert a compatible device, the AutoPlay dialog box will offer you the option to speed up your system using Windows ReadyBoost," the Microsoft documentation says.

In related news, Intel announced a new Z-P230 PATA (Parallel ATA) SSD that comes in 4 gigabyte (GB) and 8GB capacities, with a 16GB version following in September. In 1,000-unit quantities, pricing is $25 for the 4GB version and $45 for the 8GB version.

Robopanda: Headed for Extinction

| 0 comments |


This animatronic panda is a real bear to unpack and operate, and the toy doesn't offer young kids much playful interaction

Hong Kong toy company WowWee bills its new Robopanda as an interactive playmate that lets kids hear stories, play games, and engage with the animatronic bear by touching sensors on its body and paws. It's an intriguing pitch. But the toy's slipshod construction, repetitive programming, and arduous packaging made me want to ship this panda back to the preserve.

WowWee's stock in trade is cyborg-like toys that combine aspects of man, machine, and beast. Among them are a robotic dragonfly, an eerie singing Elvis bust, and Robosapien, an android that can walk and grasp objects. Robopanda, priced at $150, is the latest addition to the lineup, and like other WowWee toys, it features expressions and behaviors that users can trigger by touching it or moving it through space.

But getting this panda to sit up or stand often resulted in the toy toppling over, an outcome likely to frustrate the children who make up the toy's intended audience. (WowWee recommends the bear for kids age 8 and up.) Playing with the panda can get tiresome—its litany of stories and games prompts kids to make choices by touching a paw or foot, but beyond that, there aren't many other means of interaction. The toy's spoken instructions can be opaque; how many children are likely to decipher Robopanda's hint that "you can upgrade my memory using the cartridges"? And parents likely won't be any happier with the cumbersome and sometimes perilous setup process.

Setup Is a Real Trial

Removing the toy from its box required unwinding tightly wrapped wire in 11 places, untangling it from the toy's joints, then slicing through five sharp plastic tabs to free the Robopanda and its companion teddy bear from their imprisonment. Crudely applied tape at first obscured most of this tangle from view. A bit more attention to packaging would save much cursing and brow-dampening for the unlucky parents who have to open the box.

That's just the beginning of the setup ordeal. Installing the toy's 10 batteries requires removing and then retightening six small screws on the panda's back and feet. The instructions aren't much better. Playing with the toy requires inserting one of two program chips and moving a switch in back to one of three different "play modes," a process the manual describes on page 4. But figuring out just what those modes are ("training," a panda-led guide to features; "friend," story-telling mode; and "menu," an inexplicably named game-playing routine) requires flipping ahead to pages 9 and 10.

An Unstable Bore

Once I got all the batteries and the chip installed, I gave Robopanda's storytelling and game-playing modes a whirl. The black-and-white plastic toy stands 19 inches tall and has touch sensors in its paws, legs, belly, back, and head. Robopanda can tell whether it's picked up or put down and registers vocal responses to its questions. It even occasionally pitches forward and crawls on the floor when it's feeling exploratory.

But I found Robopanda's transitions between sitting and standing awkward; the toy often fell when it was supposed to change positions. And its interactive stories often ask little of the listener. For instance, as Robopanda tells of his experience traveling to China in a crate to visit other pandas, I was prompted merely to touch a paw to indicate I wanted to continue. I could imagine smart kids quickly tiring of the routine. Overall, playing with this panda proved to be a real bear.
