A sign that you’re a true geek: you don’t name your car, but you name your computers. Someone recently shared a cartoon on Google+ of a guy naming his gear. I can no longer find that cartoon, but it inspired me to name my computers and some other gear this weekend. On a Mac, this setting is in the Sharing section of System Preferences. On your iDevices, it is in the About section of the Settings app.
Somehow, gear seems to have more personality when you name it. I named my gear after ski runs at Big Sky, Montana. My workhorse MacBook is no longer the bland “Evan’s MacBook,” but is now the hardworking “Iron Horse.” My old iMac is no longer the boring “Evan’s iMac,” but is the reliable and most senior “Papa Bear.” My media server is “Hollywood,” and so on.
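If you prefer the Terminal, you can also set these names on a Mac from the command line with the built-in scutil tool. A minimal sketch — the names here are just examples, and the commands require an administrator password:

```shell
# Set the user-friendly computer name shown in the Sharing preferences pane
sudo scutil --set ComputerName "Iron Horse"

# Set the Bonjour name other devices see on the local network (IronHorse.local)
sudo scutil --set LocalHostName "IronHorse"

# Set the hostname used at the shell prompt
sudo scutil --set HostName "IronHorse"
```

Note that LocalHostName can’t contain spaces, which is why it’s “IronHorse” rather than “Iron Horse.”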
As a follow-up to my recent post on the World’s Most Awesome Automated Filing System, I intended to write a post on how to get your Windows-only ScanSnap scanner working on your Mac. This would have been important to those of you who switched from Windows to Mac and wanted to use your old scanners. Until recently, Fujitsu, the maker of the ScanSnap, created an artificial distinction between their Mac and Windows scanners. The hardware was identical, which should have meant that as long as you had the correct driver for your system, either scanner would work on your machine. Unfortunately, Fujitsu built a check into their drivers, so a Mac would see that you had the Windows-branded version of the ScanSnap and refuse to use the scanner. This was an incompatibility cooked up out of thin air by Fujitsu. As much as I’m a huge fan of the ScanSnap line, this had the stench of an attempt to create more sales. Fortunately, those days appear to be over.
40Tech is pleased to present this guest post by Kyle from hpinkcartridges.com.
3D printing is slowly becoming a more publicly available technology. In the not-so-distant past, the technology was really only used by big companies in industries such as engineering to create prototypes, models, and the like, but within the last few years there has been a big increase in public availability, with a number of cheaper 3D printers appearing on the market.
Bobby recently reviewed a laptop stand that claims to protect against the effects of WiFi, but we were skeptical since there was no way to prove or disprove those claims. We’re still skeptical, and I have no plans to buy the stand, but chalk a recent MSNBC article (published 3 days after Bobby’s review) up to strange coincidence. According to the article, a study by Argentinian scientists found that the electromagnetic radiation generated during wireless communications harmed sperm in a laboratory setting. But before you panic, read on for some of the details.
Remember IBM? They may not be the premier computer manufacturer they once were, but the world’s oldest computer company is still skating on the cutting edge of technology. Their latest achievement? Two prototype chips, funded under DARPA’s SyNAPSE (Systems of Neuromorphic Adaptive Plastic Scalable Electronics) program, that can learn and remember in a way that “begins to rival the brain’s function, power, and speed.”
Can you say Skynet?
Ok, so we’re nowhere near an AI-driven world takeover, but thinking computer chips that require very little in the way of size or power to operate are definitely a step in the right direction. What these chips really represent is a shift in the way computers process information. They move away from the traditional von Neumann architecture, in which processing and memory operate separately from each other. Both chips have 256 neurons; one chip contains programmable synapses, and the other, learning synapses that can “remember and learn from their own actions.”
IBM’s end goal is to create a shoebox-sized chip/brain that has some 10 billion neurons and 100 trillion synapses and runs on one kilowatt of power (the human brain has 150 trillion synapses and needs about 20 watts). Eventually, they are looking to create cognitive computers that can take detailed input from multiple sources at the same time, process it, and make decisions based on their own experiences as well as their programming. IBM has just finished phases 0 and 1 of the project and has already been experimenting with machine vision, associative memory, pattern recognition, navigation, and more.
Like I said, don’t take a hammer to your computer and networked devices just yet — the fun’s only just beginning. These brain chips could herald a whole new age of computing.