RFID chips are everywhere: companies and labs use them as access keys, Prius owners use them to start their cars, and retail giants like Wal-Mart have deployed them as inventory tracking devices. Drug manufacturers like Pfizer rely on the chips to track pharmaceuticals. The tags are also about to get a lot more personal: next-gen US passports and credit cards will contain RFIDs, and the medical industry is exploring the use of implantable chips to manage patients. According to the RFID market analysis firm IDTechEx, the push for digital inventory tracking and personal ID systems will expand the current annual market for RFIDs from $2.7 billion to as much as $26 billion by 2016.

RFID technology dates back to World War II, when the British put radio transponders in Allied aircraft to help early radar system crews distinguish good guys from bad guys. The first chips were developed in research labs in the 1960s, and by the next decade the US government was using tags to electronically authorize trucks coming into Los Alamos National Laboratory and other secure facilities. Commercial chips became widely available in the '80s, and RFID tags were soon being used to track difficult-to-manage property like farm animals and railroad cars. But over the last few years, the market for RFIDs has exploded, driven by advances in computer databases and declining chip prices. Now dozens of companies, from Motorola to Philips to Texas Instruments, manufacture the chips.

The original article at Wired is no longer available.