Monday, January 30, 2012

Beyond Passive Consumption of Technology

Technology surrounds us. We use it every day without thinking.

Yet, rarely do we stop and think about its effect on us.

Today I'm considering how I view technology and how I perceive others viewing it.

Technology as a Tool

This reflection was spurred by a friend asking the following question:

For you, what does it mean to be an active technologist who started in the Web 1.0 era?

Here's my answer to that question.

Today one could argue that everyone is an active technologist - that is, a person who uses technology as part of their everyday lives.

technology enhances our daily lives

Some of us develop new technology, but all of us enhance our daily lives with it, often without a second thought.

  • Need to know the hours of a bistro you visited years ago but don't know the exact name of? Chances are you "Google" for the information on your smartphone. No more dialing 411 and hoping the operator connects you with the right bistro.
  • Want to share the news that you're pregnant? In place of handwritten letters, you may tweet, share on Facebook, or post to your blog.
  • If you pick up a phone to call with your news and no one answers on the other end, you leave a message.


early technology use influences how we view it in our lives

While almost all of us use technology daily, our relationship with it and how we view it differ greatly. I think one's early encounters with technology influence whether we develop new technology, adapt it with purpose, or just use it without thinking.

for context, my introduction to technology

In case you're curious, I'm a GenXer, born in 1969. I started using and adapting technology before the Web 1.0 era.

I don't have a CS degree, and I've never had Software Engineer in my title although I had Engineer in my title for almost six years. I have a Bachelor of Science in Civil Engineering, a Bachelor of Arts in Communication, and an MBA with emphases in Supply Chain Management and Marketing.

In all my courses of study, technology and software were routinely used to complete tasks and explore hypotheses. I was exposed to computer programming through BASIC when my dad brought home an Atari 400 in 1979. In college, I learned Ada and Pascal. At work, I developed user interfaces to R:Base, dBase, and later Microsoft Access databases. I went on to test and document programs written in Scheme, C, C++, and Java.

what-you-see-is-what-you-get (WYSIWYG) replaces slow, costly iteration

To me, those who first experienced technology in the Web 1.0 era take it for granted.

For example, they'll never know the wonder of WYSIWYG - it's the default today.

When I first started designing, direct mailers and brochures didn't begin to take shape until you printed your first draft. All you saw on the screen was a bunch of code and coordinates. Revisions didn't take minutes. First, you had to schedule time on a shared IBM. Then, you had to carefully input your design, cross your fingers that everything was right, and print. You'd log off and walk down the hall to the one shared office printer, where what you'd see was a dot matrix approximation of your artwork. If it was close to your vision, you'd break your design into individual CMYK layers and send it off to a print shop. They'd generate a proof, and you'd find out whether you got your masks correct. More often than not, something would be wrong.

In college, I got my own computer - a Mac Plus. It didn't matter that the display was black-and-white; it opened up new worlds with SuperDraw and Word. I now had WYSIWYG capabilities, a variety of font choices, and the ability to illustrate class notes using a mouse to draw a line across my screen instead of plotting coordinates. My notes were transformed into bound, indexed references. Those books would land me my first job at a technology firm.

Today in a couple of hours I can fully execute a design concept. Grab props for a photo shoot, take a few snaps, transfer the images from the digital camera to my MacBook Pro, open them up in Adobe Photoshop, adjust blacks and midtones, drop them into Adobe InDesign, select my fonts, and print. When I'm happy with my brochure I can print it at my desk or send it to a print shop.

Unlike in the past, I don't have to worry about portions of my design being lost to incorrectly set up masks. Today's printers have digital presses that take full-color images.

hacking reserved for those with formal CS degrees or titles of Software Engineer

From the pre-Web 1.0 era to now, the demographics and attitude of active technologists have changed - and not in a good way in my opinion.

"Hacking" or adapting technology to fit your needs has become less commonplace in the corporate arena. Here I'm purposely excluding technology firms or software teams from this observation and pulling from personal experiences I had working at Real Estate offices, Construction firms, Print Shops, and Design Agencies.

Today in an office environment, "hacking" seems to be reserved for those with formal CS degrees or titles of Software Engineer. When I first started working, companies outside of civil/structural engineering firms had, if they were lucky, programmable typewriters. As computers came into offices, there was no expectation of what you were supposed to use them for.

Computers were tools with unlimited potential. If you could dream it, you could make it happen. I created animated films using Aldus Persuasion, a predecessor to Microsoft PowerPoint. I used Frame Technology's FrameMaker Markup Language (MML) to craft contextual help that executed commands as you walked through the steps in a tutorial. (This was before Microsoft introduced Clippy.) MML wasn't designed for creating in-product help, but there was no one to say you needed to buy an online help product such as Blue Sky Software's RoboHELP. No one said the software couldn't be used that way.

Computers were the great leveler. Anyone from an administrative assistant to a manager could write a software program that could end up being rolled out across the organization.

a lack of ingenuity or "figure-it-out-ability"

Nowadays computers and software programs come with expectations of what they're used for.

There's less experimentation. Today, if you don't have software designed for a specific task, no one looks at other tools to see whether they could be leveraged; "it can't be done" rings out instead.

For me, being an active technologist who started pre-Web 1.0 means that I have a different relationship with technology than those who started in the Web 1.0 era or more recently in the Web 2.0 era. Where they may see technology with defined edges, I expect to be able to take whatever technology I come across and use it to make my dreams reality.

I see technology as malleable, something that has evolved and will continue to evolve in my hands and the hands of others.

And I wish more people outside of CS, who grew up with technology everywhere, viewed it as approachable: a tool that, with their creativity and direction, has unlimited possibilities and can be transformed.

your turn, what are the characteristics of an active technologist? are you one?

So, now I ask you the following question:

For you, what does it mean to be an active technologist who started in the pre-Web 1.0 era? In the Web 1.0 era? Or, in the Web 2.0 era?
genuinely eden

Credits: All layouts designed by and images taken by Eden Hensley Silverstein for The Road to the Good Life.