I have been a fan of Robert X. Cringely for a long time. Back in the day I used Bob's Nerds 2.0.1 PBS documentary to train new hires on the basics and history of the Internet. While I am not a big enough fan to know what the X. is all about, I must admit much of my writing style is a poor man's knock-off of Mr. Cringely's.
The chapter entitled "The Prophet" is worth the price of admission. The prophet is Steve Jobs and what Cringely writes is prophetic.
Steve Jobs sees the personal computer as his tool for changing the world. I know it sounds a lot like Bill Gates, but it's really very different. Gates sees the personal computer as a tool for transferring every stray dollar, deutsche mark, and kopeck in the world into his pocket. Gates doesn't really give a damn how people interact with computers as long as they pay up. Jobs gives a damn. He wants to tell the world how to compute, to set the style for computing.
Pretty spot on.
Some of the passages in the book are clearly timeless in the land of the geeks.
People who actually rely on computers in their work won't tolerate being more than one hardware generation behind the leading edge.
And this one could be ripped from many articles on impending IPOs today.
Companies don't go public to raise money; they go public to make real the wealth of their founders.
Accidental Empires is both enjoyable and a must-read for anyone who cares about computing.
A long time ago, in a place not too far away, a bunch of geeks I respect a great deal told me to make the switch to Mac because OS X was a Unix-based operating system. I heeded their advice and have been generally pleased using Apple computers for about nine years running.
Flash forward. A few years ago, over a bowl of oatmeal, Jack announced that he wanted to go to Purdue and learn how to make video games. I think he was nine at the time. It did not seem like a bad career choice to me, so I decided a little encouragement was appropriate.
Jack likes money. I offered to double his allowance if he would take over IT support for the Weatherby household. He agreed, and I taught him the ropes on what was needed to troubleshoot the home network and keep the computers up to date and functioning properly. We made a little notebook on the steps involved and off he went. If there are home network issues he now troubleshoots them, every Saturday morning he runs software updates on the computers, and once a month he does heavy maintenance.
At the time we were a dual Mac/Windows household (and still are, with the exception of Abby, who needs a Windows laptop for accounting applications). So after about six months of this sysadmining I asked Jack, the nine-year-old, a simple question: "What do you like better, Mac or Windows?" His reply: "Mac." "Well, why is that?" I inquired. His reply: "Because it's easier."
And that, from the mouth of a nine-year-old, is why Apple's staggering quarter is really no surprise.
For about a year I have been working on a project called LiquidText. LiquidText is a multi-touch document manipulation application for active reading. That sounds a little boring for a technology that is very, very cool. Here's a quick demo.
LiquidText is the brainchild of Georgia Tech student Craig Tashman.
One of my favorite outings of the year is to walk across the courtyard from ATDC to the Technology Square Research Building for the GVU Showcase. Two floors chock full of cool geeky stuff. Stuff like augmented reality, brain-computer interfaces, tangible media, and wearable computing. There will be over 100 research demonstrations by Georgia Tech faculty and students. It's a free and fun glimpse into the future.
So yesterday, according to The New York Times, Apple took about 600,000 new iPhone orders. Rock on. At about $200 a pop, that is about $120 million in revenue, plus whatever the AT&T subsidy happens to be. Most likely three times that, or $360 million in total. Not bad for a day when you are unable to complete customer orders.
A big yippee ki-yay for Steve and his dedicated team from Cupertino. Atlanta-based Gerry Purdy chimed in: “It shows the Apple magic is still present, it’s impressive.”
As a heavy Apple user and someone who intends to get a new iPhone, my response is "perhaps." The money quote from the NYT article:
Still, analysts said, Apple is struggling to maintain the same clear-cut lead over rivals that it had in the past. In particular, the growing portfolio of Android-powered phones, which number in the dozens this year and are offered by many companies, is a significant threat.

“The reality is that in the long term, the Android market share is going to catch up to Apple,” said Charles Golvin, a wireless analyst at Forrester Research. “When you have one device being sold to a smaller portion of the population, it’s not going to compete as well as many devices from many vendors on multiple carriers.”
We've seen this movie before. Apple. Microsoft. The personal computer early market share war. We all know how that ended. Apple niche. Microsoft dominance.
The end game for the mobile OS wars has been foreshadowed.
He is writing about the interesting things that happen when technology innovation and business execution impact each other: how technologists can get things done in a technology company and what business managers need to know to be successful. When Worlds Collide is about how organizations can succeed when business execution and technology innovation seem to be on a collision course. The "hey, you got your peanut butter on my chocolate" result.
Rich has an impressive background. He was in charge of the rebirth of undergraduate education in computer science at Georgia Tech, Chief Technology Officer at Hewlett-Packard, VP of Computer Science Research at Bellcore, and Director of Computing Research at the National Science Foundation. I had the chance to meet him at a football game a few years back. He is a nice guy, and much easier to talk to than his background might suggest.
Last week the first sentence of an article in InformationWeek, a periodical specifically targeted at IT employees of the U.S. government, read as follows:
‘The General Services Administration has issued a Request For Quotation for cloud storage, Web hosting, and virtual machine services.’
This dry and seemingly innocent statement is in reality a blockbuster: a headliner worthy of amazement, possibly, and of further investigation, surely. Any computer industry veteran with federal government dealings will tell you the phrase 'U.S. government technology innovation' is an oxymoron (with the notable exceptions of the DOD and NASA). And now, lo and behold, the stodgiest of the stodgy is rapidly moving (that's correct, rapidly) past all but the most innovative organizations in the world into the era of cloud computing! I'll throw in a few cloud basics in a minute, but for those who aren't 'in on' the cloud computing debate: many still question whether cloud computing is a transformational advance in computing, as many claim, or just a horribly overblown catch phrase meaning little. The InformationWeek statement is a really big nail in the coffin of the catch-phrase camp. It's not the first and it won't be the last.
Why the sudden change after years of being exactly the slow-moving, inefficient bureaucracy all small-government proponents complain about? Surely even the victory of a candidate who campaigned on upgrading the nation's technology infrastructure could not have caused such a swift change in culture in seven months. Maybe this technology innovation is too compelling to ignore, even for this historically laggard organization. Maybe both. Neither seems likely, but here we have it.
Let’s look at a few basic definitions to make sure everybody is on the same page. To follow the theme I will take this information from the National Institute of Standards and Technology (NIST):
Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.
This cloud model promotes availability and is composed of five essential characteristics, three delivery models, and four deployment models.
On-demand self-service
Ubiquitous network access
Location independent resource pooling
Rapid elasticity
Measured service
Cloud Software as a Service (SaaS) - Use provider’s applications over a network
Cloud Platform as a Service (PaaS) - Deploy customer-created applications to a cloud
Cloud Infrastructure as a Service (IaaS) - Rent processing, storage, network capacity, and other fundamental computing resources
Private cloud - enterprise owned or leased
Community cloud - shared infrastructure for specific community
Public cloud - Sold to the public, mega-scale infrastructure
Hybrid cloud - Composition of two or more clouds
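The NIST breakdown can be captured in a small data structure. A sketch in Python, using the characteristic names from NIST's draft definition:

```python
# The NIST cloud model as plain data: five essential characteristics,
# three delivery models, and four deployment models.
NIST_CLOUD_MODEL = {
    "essential_characteristics": [
        "On-demand self-service",
        "Ubiquitous network access",
        "Location independent resource pooling",
        "Rapid elasticity",
        "Measured service",
    ],
    "delivery_models": ["SaaS", "PaaS", "IaaS"],
    "deployment_models": ["Private", "Community", "Public", "Hybrid"],
}

# Sanity check: the counts match NIST's "five / three / four" framing.
assert len(NIST_CLOUD_MODEL["essential_characteristics"]) == 5
assert len(NIST_CLOUD_MODEL["delivery_models"]) == 3
assert len(NIST_CLOUD_MODEL["deployment_models"]) == 4
```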
Private enterprise is still trying to achieve a collective understanding of cloud computing and its value, if any. Many continue to argue over the viability of clouds for any but a few unique applications (massive data, map/reduce, etc.), claiming obstacles ranging from security to auditability to performance. After investigating these issues for many months, NIST has moved past these arguments and published a very mature definition of cloud computing, and an even more mature document on how to efficiently and securely implement a cloud computing environment. The document examines each advantage and each challenge presented by cloud computing. NIST sees the huge advantages many others see, but it has also taken a hard look at the challenges and views them as difficult but solvable. NIST is even helping drive some of the standards in areas such as security.
In another fascinating development, a fellow by the name of Vivek Kundra was tapped by Obama as the nation's first Chief Information Officer. Before taking the federal CIO role, Kundra was the CIO for the city of Washington, D.C. The punch line? In less than two years Kundra moved the city to Google Apps, replacing Microsoft Office software with Google's SaaS offering. The city now posts its procurement process on YouTube. Kundra launched the 'Apps for Democracy' Web 2.0-style collaboration contest in hopes of giving citizens a portal into such government information as crime reports and pothole repair schedules. Kundra expected maybe 10 apps and got 47 in 30 days. This endeavor saved the city roughly $2.6 million even after the $50K prize money was paid. Kundra works by the mantra that citizens are "co-creators rather than subjects." This 34-year-old Gen Xer is now in charge of information technology for the whole government! Looks like this may no longer be your grandfather's federal government, at least in IT.
All of the primary cloud vendors have stepped up their efforts to sell this latest technology to the government, since they too see the sleeping giant awakening. Companies like Amazon and Google, which had done little business with the government in the past, are now pushing the adoption of their cloud-based technologies in all parts of the government. SalesForce.com, possibly the most successful SaaS company, has also sold its software to several branches of the government. The DOD and NASA have both adopted elements of cloud computing. NASA uses SalesForce.com and the open source cloud software Eucalyptus. DOD is very actively promoting cloud computing internally.
The most innovative companies in the world today have provided the missing pieces needed to launch cloud computing as the NEXT BIG THING. Even the traditionally sloth-like U.S. government is on board. To be sure, many of the truly transformational changes cloud computing will drive won't be possible until the various technologies collectively dubbed 'cloud computing' mature even further. Additional innovations (some already envisioned, some not), such as better security, improved virtualization management, and real cloud interoperability, are needed to speed cloud adoption rates. Nonetheless, cloud computing is a transformational advance in technology on the level of the worldwide web or the PC, or even bigger. If you don't agree and you are a CIO, you should look for a career 'Plan B.'
This is a guest post by Russell Jurney, a technologist and serial entrepreneur. His new startup, Cloud Stenography, will launch later this year. The article is an extension of a simple question on Twitter about the importance of MapReduce. Some subjects take much more than 140 characters.
The Technical Situation in Brief
The advent of the personal computer and the VisiCalc spreadsheet were the foundation for a revolution in computing, business, and life, whereby normal people could carry out sophisticated accounting, analysis, and forecasting to inform their decisions and arrive at more positive outcomes. As Moore's law has progressed, processors have become faster, computers have been inter-networked, and large volumes of highly granular data have been collected. Analysis of terabyte datasets on the same level as a spreadsheet has been limited by the disparity of acceleration between processor speed and computer I/O (input/output) operations. Intel has produced ever faster processor clock speeds without accompanying improvements in disk, RAM, or bus speeds. Put simply: we have cheap and numerous computing resources and abundant data, but bringing those resources to bear on that data to generate real value from it has proven exceedingly difficult.
The widespread use of relational databases to access data in pre-defined static relationships has also limited our ability to discover and infer new and unique relationships among data. Dynamic analysis of large volumes of data in relational databases requires exhaustive pre-calculation of indexes and summaries of data for each relationship, and scaling relational databases to handle large datasets is a complex, painful, and expensive process. As a result, business intelligence systems relying on relational databases are prohibitively complex and expensive. Other methods of raw parallel computation, such as MPI, were exceedingly difficult. Such 'smart kid only' technologies have significant barriers to entry for mere mortals. In fact, multi-threaded, shared-memory computation in languages like C++ is considered one of the most difficult, arcane areas of computer science, leading to entire languages aimed at making concurrency easier.
MapReduce As the Way Forward
In order to extract value from large piles of data, we must escape the bounds of I/O by going parallel and having many processors work on the data at once, without grinding our development to a halt dealing with complex algorithms and frameworks. MapReduce and the platforms that implement it satisfy this requirement for a surprisingly broad set of problems. MapReduce is a simple way to process data in parallel among many commodity machines. You are already familiar with the power of MapReduce in your daily use of it: it is the pattern pioneered by Google to bring you the effective search on which we now all depend.
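The shape of the pattern can be sketched in a few lines of plain Python: a map function emits key/value pairs, the pairs are grouped by key (the "shuffle"), and a reduce function folds each group into a result. This is only an illustrative single-machine sketch of the programming model, using word counting (the traditional "hello world" of MapReduce); a real framework runs each phase across many machines.

```python
from collections import defaultdict

def map_phase(record):
    """Emit (word, 1) for every word in a line of text."""
    for word in record.split():
        yield word.lower(), 1

def shuffle(pairs):
    """Group all emitted values by key, as a framework would between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Fold the values for one key into a single result."""
    return key, sum(values)

def mapreduce(records):
    pairs = (pair for record in records for pair in map_phase(record))
    return dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())

counts = mapreduce(["the quick brown fox", "the lazy dog", "the fox"])
# counts["the"] == 3, counts["fox"] == 2
```

The appeal is that the programmer writes only `map_phase` and `reduce_phase`; the framework handles distribution, grouping, and failure recovery.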
MapReduce is the design pattern that, in combination with recent developments in cloud computing and cheap, plentiful broadband, will bring us spreadsheet-style analysis of vast amounts of data ill-suited to traditional database management systems in both scale and structure. MapReduce offers a cost-effective way for any business to harness massive amounts of computational power in the cloud for short periods of time to perform complex computations on large volumes of data that would be prohibitively expensive and time-consuming on an individual machine, or that would require the construction of a data center to handle.
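The economics come down to simple arithmetic: cloud billing is machines × hours × rate, so 100 machines for one hour costs the same as one machine for 100 hours, while the wall-clock time collapses. A tiny sketch, using a purely hypothetical hourly rate:

```python
# Hypothetical cloud rate, purely for illustration.
RATE_PER_MACHINE_HOUR = 0.10  # dollars

def job_cost(machines, hours):
    """Cloud billing: machine-hours consumed times the hourly rate."""
    return machines * hours * RATE_PER_MACHINE_HOUR

# One machine grinding for 100 hours...
serial = job_cost(machines=1, hours=100)
# ...costs the same as 100 machines finishing in one hour.
parallel = job_cost(machines=100, hours=1)

assert serial == parallel  # same cost; only the elapsed time differs
```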
The Business Impact
What does this mean for your business? Knowledge of MapReduce has spread beyond Google, and it is now used by an increasing number of companies to extract value from web-scale data. Facebook, Yahoo, Cloudera, and many others have embraced MapReduce in the form of Apache Hadoop, the platform around which most open discussion of MapReduce has occurred. As a result, a new generation of startups is rising that will take advantage of MapReduce to bring the same power that Google pioneered on search to bear on a variety of datasets. New opportunities exist by 'thinking big' and extracting value from ever-increasing streams and volumes of data.
Example 1: Proving Global Warming
What does this really mean? It means that developers will have a clear way to reduce vast datasets to scales they can work with, extracting information to inform your decisions. In this example from Cloudera, Hadoop and Pig are used to query a 138GB log of weather history for the last 100 years from the National Climatic Data Center, reducing that vast data to a scale the developer is comfortable working with. The result is this chart:

As a pile of data, the NCDC log informs nothing. When queried via MapReduce using Hadoop and Pig, we arrive at an informative chart that shows us an important trend. Would that chart inform a discussion about global warming? If you could get such clear visualizations about every minute detail of your business critical to your success, would it inform your decisions? Can you log and mine more data to streamline your operations?
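The weather query has the same map/group/reduce shape as any other MapReduce job. Here is a toy single-machine sketch (the record format and the readings below are invented for illustration; the real NCDC logs are fixed-width records that need parsing and quality filtering): map each raw reading to a (year, temperature) pair, group by year, and reduce each group to its maximum, which is the data behind a chart like Cloudera's.

```python
from collections import defaultdict

# Invented sample records in a "year,temperature_c" format.
readings = [
    "1901,-3.2", "1901,12.8", "1950,15.1",
    "1950,9.4", "2000,17.6", "2000,16.0",
]

def map_reading(record):
    """Map phase: raw record -> (year, temperature) pair."""
    year, temp = record.split(",")
    return int(year), float(temp)

# Shuffle: group temperatures by year.
by_year = defaultdict(list)
for record in readings:
    year, temp = map_reading(record)
    by_year[year].append(temp)

# Reduce: one summary value per year -- the series a chart would plot.
max_by_year = {year: max(temps) for year, temps in by_year.items()}
# {1901: 12.8, 1950: 15.1, 2000: 17.6}
```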
We are constrained in our strategies by what we imagine possible. MapReduce and cloud computing open broad possibilities and business opportunities by placing a usable supercomputer, rented by the hour, in the hands of every startup that wants one. There is no problem you lack the processing power to solve; it's just a question of whether the hourly cost is profitable. That's a profound change from being bound to one machine. As a result of this shift, smaller companies can attack 'bigger' problems without a large up-front investment in hardware or infrastructure.
A new renaissance in computing is coming that will be comparable to the business adoption of the personal computer and VisiCalc, and MapReduce will drive it.
Force of Good is licensed under a Creative Commons License. You are free to share, remix, and share alike with attribution.
The opinions expressed here are mine and mine alone (with the exception of comments by others of course). They do not represent the opinion or position of any other person or entity. All postings adhere to my personal values.