Do you remember when cell phones were for rich people? It’s only a short jump in my memory to the day when a homeless kid got angry at me when I told him I didn’t have any change, convinced I must be lying because I was carrying a mobile phone. You know, back when they still kind of looked like phones, and Nokia was king? I felt bad for the guy, but I really was broke. I got the cell phone on credit, could barely pay the bill, and was having many a fight with the company over false charges.
This ramble isn’t to point out that cell phone companies were crooks, even back then, and it’s not to talk about my questionable technology-money choices. The point is that this was only a few years back. I was in my early 20s — I’m only in my mid-30s now — and have gone from having no computer, an unused email address, and the blissful (and retrospective) peace of not knowing or caring where people were or what they were doing, to being a geek tech-blogger who makes his living in online marketing and communications. I own an iPhone, my hold-out wife has finally gotten an Android, and my three-year-old owns my iPad — and regularly sends me artwork via email.
Tech is Hungry
Technology is now in the palms of tiny little hands. It’s affordable, or at least readily available, to the majority of the planet, and its entire weight of purpose seems to be to interconnect everyone and everything as quickly and as deeply as possible. The flow of information has reached truly epic proportions, as has the ability and desire to track that flow, along with the habits of the people drowning in it.
The technology behind this phenomenon feeds upon itself, and in many cases, it exists only to further itself. Some of the biggest blogs out there are only so popular because people need a filter; a place to better understand, control, and find some sense of order in the massive technology machine — redundant as that phrase may seem. Smaller blogs exist for the same reason. It was likely part of why Evan started 40Tech, why I joined him, and why you are reading this post right now.
Facebook is a prime example of the direction of technology. Its sole purpose is to become familiar and intricately entwined with as much of your life as possible. It attempts to augment your life: make it easier, faster, more connected. It’s addictive. Facebook is so successful at this that it has become embedded in the general populace to the extent that it can almost be perceived in the same way as a governing body. It creates rules that dictate our way of life, is an easy target for privacy concerns and conspiracy theories, and the smallest changes can lead to virtual revolt and widespread public outcry. Facebook, much like many of the governments out there, projects an image of a body that wants to further mankind: make the world we live in a better place and all that. And like many governments, it’s more than a bit of a stretch for most people to really believe that’s true.
Facebook isn’t going anywhere, either — not without a scandal that shakes the entire foundation of their business to the core, or a hostile takeover by a frightened government or technological superpower. With some of the things in the media regarding questionable privacy practices and the rapid expansion of Google+, those things may not seem so far-fetched, but even if the big bad were to happen to the social media giant, it would probably just morph, as opposed to vanish.
Social connectivity is a way of life for us now, whether we like it or not, and no matter the anxiety, stress, or fun disorders it could cause or amplify. It appeals to the voyeur in us. It allows us to meet people we would otherwise never meet, and keep in touch with people to a degree that would be impossible without it. It is a part of work, school, play, business, entertainment, and everyday, mundane life. For Pete’s sake, your washing machine can already contact you to let you know your laundry is done, and there are tweeting dog collars, man!
Bring on the Microchips!
Over the next 10-20 years, unless the “social media bubble” or end-of-days people are right, we will likely find ourselves micro-chipped, QR-coded, and surfing the web while jogging with augmented reality sunglasses that also allow us to huddle with our families, friends, or business contacts on GoogleBook. Don’t ask me how they will take our video — somebody else will figure that out, I’m sure. That is, of course, unless we are all suffering from Wi-Fi, cellular, and Bluetooth radiation poisoning, which could bring the world to a screeching and potentially catastrophic halt that would make Y2K fears look like a happy day at the park.
Or maybe we’ll be busy ripping the fabric of the universe apart with time machines. Did you hear that Albert Einstein may have been wrong? Some scientists at CERN, near Geneva, may have just recorded neutrinos that were travelling faster than the speed of light. That might disprove the Theory of Relativity and screw up one of the major fundamentals of modern physics. Learned that on Google+, I did… And I’ll be sharing it on Facebook, too.
One might think that the world wide web, which is still predominantly text-based, would be the spearhead in the rise to new heights of literary articulation. Unfortunately, if you were the one thinking that, you were sadly misinformed. In actuality, the ability or willingness to write with proper grammar and spelling has been replaced by a general acceptance of a lower standard. Or at least, the acceptance appears general. Where do you stand on the subject?
I confess that poor spelling and grammar are a pet peeve of mine. When I am reading something that runs rampant with glaring errors, I find it difficult and irritating to read, and the work loses credibility in my eyes. There are levels, however. While misspellings like “definately” and “loose” (for “lose”) always make me cringe a bit, I make allowances for posts and comments that have mistakes in them. I recognize that, while English is the most prominent language on the web (at least in my own experience), many of the active participants of the social and interactive super-real-time web are not native speakers (or writers). If I had to communicate in other languages, I have no doubt that my writings could easily be the stuff laughing stocks are made of.
Where I draw the line, however, is with “texting” or “IM” style writing. Some of that has its place, too — or had, before the mass adoption of full hardware and software keyboards — but forgive me if I think that there is never a good excuse to write “wat” in place of “what.” That’s almost enough to get me to stop reading altogether. I also can’t stand l33t. Practically unheard of for a tech-geek, I know, but the secret code of elite nerds always struck me as a really annoying oxymoron.
Now before those who are inclined start tearing apart some of the grammatical inconsistencies of this post, I should mention that I am OK with conversational writing. That is to say that I don’t mind some liberties being taken to convey tone and flow that, on some level, emulates how two friends or acquaintances would talk with one another. In fact, I think that sort of writing is essential on the web. It is part of what makes a blog post resonant, and helps the reader and writer to identify with one another. How far I’m willing to accept this style of writing depends on the subject matter, the points I made above, and quite likely, my age/maturity level while reading. And I’m fully aware and accepting of the fact that my maturity level can fluctuate… :P
Is my acceptance of even a limited degradation of writing on the web part of the overall problem? Probably. Is it one of the factors that leads to established journalists getting lazy (and sometimes disappointing) with their writing? Again, probably. It’s all tied in with other factors, like language barriers and the attention-deficit-fostering speed of the online world. Is there a line that should be drawn in the sand somewhere, though? Should people be publicly flogged for ignoring the oh-so-convenient spell-check integrations out there? Personally, I think that spell-check, auto-correct, and especially auto-complete are actually part of the problem. Since I started using the iPhone and iPad, for example, I have noticed a marked increase in mistakes while typing on a full keyboard — especially with contractions.
What about you? Where do you draw the line — or do you care at all? Why?
OK, look — I don’t know if these are real scientists or what, but a bunch of song and musical-style parody videos where the subject is crazy science concepts and gear that even I (in my extreme geekiness) know almost nothing about, well… let’s just say I was too busy giggling to find out. And that was one hell of a run-on sentence – but hey! It’s another long weekend up here in Canada, and when I get a long weekend post, I like to loosen up the reins a bit.
Seriously, though, if you want something that will boggle your brain, and maybe make you laugh to the point where your loved ones and co-workers look at you funny (not talking about myself here… really!), then you need to take a look at these!
Today, 40Tech is pleased to present you with a guest post by Jaelithe.
Everyone you know has one—everyone except for you. The mere mention of the shiny rectangle has your IT guy cussing under his breath. One look at the interface of this phone and it’s obvious it was designed for the consumer, but that doesn’t mean it can’t do the heavy lifting, too. So what has your IT department so worked up?
Sensitive Corporate Data
In 2007 the iPhone stomped onto the cellular scene with huge technical advances and major curb appeal, but it lacked the critical security features IT departments needed to jump on the bandwagon. At first, the iPhone didn’t support the encryption of user data and didn’t have a solution to remotely wipe data clean in the event the phone was lost or stolen. Enterprise fraud management is a huge IT concern, and it becomes an even bigger one if your IT department has to worry about you and the guy you left your iPhone next to on the bus. In addition, many IT departments achieve corporate goals with third-party applications and office suites, which the iPhone didn’t yet support. Apple quickly responded, adding support for third-party apps and the ability to interact with Exchange servers. Still, the memory of the first iPhone’s limited capabilities echoes in the minds of IT professionals everywhere—it could take a while for them to shake off the stigma.
Your company doesn’t want to pay for you to take photos of your abs in the bathroom (or other ridiculousness). The iPhone has a myriad of fun, snazzy features, but companies don’t want to pay for you to take pictures for Facebook, play Angry Birds, or watch YouTube videos featuring cats jumping out of boxes. It’s critical to IT departments that they’re able to customize the features and define settings on the device in order to effectively manage compliance with the company’s acceptable-use policy. Apple is now delivering those solutions to administrators.
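For the curious, those administrator controls typically arrive as Apple configuration profiles: XML property lists that lock down individual features on the device. As a rough sketch (the `com.apple.applicationaccess` payload type and the `allowCamera` key are real parts of Apple’s restrictions payload, but the identifier and organization values below are made-up examples), a fragment that disables the camera might look like this:

```xml
<!-- Hypothetical fragment of a .mobileconfig restrictions payload.
     PayloadType com.apple.applicationaccess is Apple's restrictions
     payload; the identifier and org name here are invented examples. -->
<dict>
    <key>PayloadType</key>
    <string>com.apple.applicationaccess</string>
    <key>PayloadVersion</key>
    <integer>1</integer>
    <key>PayloadIdentifier</key>
    <string>com.example.corp.restrictions</string>
    <key>PayloadOrganization</key>
    <string>Example Corp IT</string>
    <!-- Disable the built-in camera (no more bathroom ab photos) -->
    <key>allowCamera</key>
    <false/>
</dict>
```

Profiles like this can be installed by hand or pushed out from a mobile device management server, which is exactly the kind of centralized control IT departments have been waiting for.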
In 2007 there were few apps that applied to serious business folks, but now there’s a never-ending supply of apps specifically engineered to support business objectives. As Apple provides more and more solutions, it will be difficult for IT departments to hold their stance for long.
Does your IT department still hate the iPhone, or have they come around? How do they feel about Android?
Jaelithe is a freelance writer interested in all things tech. Jaelithe and her iPhone Irene live a very happy life together filled with technology and productivity. You can usually find Jaelithe writing about enterprise fraud prevention for Attachmate, and the ways that gadgets can enhance everyday life.