The trusty old credit card is getting a major update — but is our country ready?

You may be familiar with “chip and PIN” cards, those high-tech credit cards that contain a small microchip. Traditional credit cards use a magnetic stripe to process payments, but that technology is inherently problematic. Between skimmers that copy the stripe’s data and signatures that anyone can learn to forge, magnetic stripes simply don’t offer consumers much protection. Chip and PIN cards were rolled out in the early 2000s to put an end to some of these common security complaints.

The chip is more secure because it can’t be cloned: instead of handing over the same static data on every swipe, the chip generates a unique, one-time code for each transaction, so copied data is worthless to a thief. This, combined with the portable wireless terminals of the past ten or so years, has made it possible to process payments with a secure card right in front of the consumer. No more taking credit cards to a back room, far out of sight, where skimming can happen unnoticed.
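For the technically curious, here’s a minimal sketch of that difference in Python. The key, message format, and HMAC hashing below are simplified stand-ins for illustration, not the real EMV protocol:

```python
import hmac
import hashlib

SECRET_KEY = b"key-locked-inside-the-chip"  # hypothetical; never leaves the chip

def stripe_track_data(card_number: str) -> str:
    # A magnetic stripe stores the same static string forever,
    # so one successful skim yields a perfect, reusable clone.
    return f"{card_number}|exp=2027-01|name=A. CONSUMER"

def chip_cryptogram(card_number: str, counter: int, amount_cents: int) -> str:
    # A chip card signs each transaction with its secret key and an
    # incrementing counter, so every payment produces a one-time code
    # that is useless to a thief who intercepts and replays it.
    message = f"{card_number}|{counter}|{amount_cents}".encode()
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()[:16]

print(stripe_track_data("4111111111111111"))         # identical every swipe
print(chip_cryptogram("4111111111111111", 1, 4999))  # unique to this purchase
print(chip_cryptogram("4111111111111111", 2, 4999))  # different the next time
```

Because the secret key never leaves the chip, a skimmer can at best capture one already-spent code, which the bank will refuse to honor a second time.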

President Obama signed an executive order moving government payment cards to chip and PIN starting in January 2015. There’s a strong push to get the rest of the country to switch over by October, a mere six months away. Unfortunately, US businesses have been slow to adopt the new technology. The problem has more to do with infrastructure, the innumerable credit card terminals found in practically every establishment nationwide, than with the credit card companies themselves.

So do you need to get a chip card?

Sooner or later, yes. But don’t drop what you’re doing just yet. Since not all retailers accept chip payments, it’s advisable to keep a traditional credit card handy just in case. But if you value the extra protection the chip has to offer (and we hope you do), talk to your credit card company about issuing you a new card. You’ll be ahead of the game and have more secure finances because of it.