Understanding Innovation
J. Concordia, 9/01/2019
Innovation: What it means and what it takes
Preface:
Three years ago I wrote a document that reviewed some predictions on what the future would bring in terms of technological change. I called it "The Next Big Future" [see supplemental readings]. In certain fields of technology, particularly computers, three years is a long time and many new products can be expected to arrive in that time span. This is also true in communications and IT, which are for the most part subsets, or really just applications, of the broad field of computer technology. In this document I will revisit some of the sources for the earlier writing to see what changes have been introduced in the past three years. However, the principal focus of this document will be to discuss what the drivers are for innovation and the skillsets innovators use for their creative accomplishments.
Introduction
Webster's defines innovation as "the act or process of introducing new ideas, devices, or methods". Modern society has co-opted the term, placing it very much in the context of business enterprise and, as such, focusing it on operating principles in today's commercial businesses. This is discussed in blogs published by Forbes and authored by Michelle Greenwald [2]. In her March 24, 2014 blog Greenwald gives a much broader interpretation to the meaning of innovation. She applies it to a whole range of human activity; however, she does concentrate on how the term applies in the business world. She has developed a business, "Inventours.com" [3], built on the institutionalization of innovation as a key path to success in business. She writes: "Innovation can be described in different ways, through…
  
- Words: Original, unexpected, fresh, never been thought of before, never been seen before, creative, new, useful
- Reactions: “Isn’t that clever?”, “What a great idea!”, and “Why didn’t anyone ever think of doing it before?”
- Descriptions: Challenging conventional notions of how things have been done before, and bringing ideas from one industry to another, or from one geographic region to another.
- Strategic Criteria such as: Creating meaningful points of difference for products and services vs. current alternatives."
While Ms. Greenwald is spot on with some of those words, it may seem that her belief is that innovation is something virtually "invented" by the likes of Apple and Microsoft or their counterparts of the past 20 to 30 years. Innovation has much deeper roots and much greater significance in worldly sociological evolution. Innovation has been a recurring process from the origins of mankind. When the first hairy man left the cave and built a home for his mate and children he was innovating. He found a way to protect his family from the wild beasts that was more secure, more comfortable, and available to him in an exclusive way. Likewise, when he fashioned pieces of wood into arrows or spears he found a way to get food without engaging the wild beasts in hand-to-hand battle with a club or a boulder. He could now hunt them down from a distance. Likewise, the steps of change from a band of hunter-gatherers to a world comprised of communities with socially structured civilized populations were all innovations. Moreover, it is my belief that those innovations had even more significance for society than the modern technological marvels of today. They made it possible for the innovators of today to do their things. Indeed, all innovation builds on the knowledge, inventiveness, and successes of prior innovators.
All innovation is an incremental advance from some prior art. The major differences are in the size and scope of the incremental advancement and the elapsed time to bring the advancement to fruition. All innovators stand on the shoulders of someone who did something earlier to make it possible for this particular innovation to succeed at this time. This is not to say innovations are trivial accomplishments. Real innovation entails much skill, intelligence, and persistent dedication to find a better way. However, proclaiming every person who comes up with a better way of making and selling something "a genius" may be overstating things.
Approach to Study
An understanding of innovation is best achieved by viewing innovation from a historical perspective. It is best developed by looking back in time at some current examples rather than building forward from some assumed beginning. It would take a very big effort to do this for a lot of our modern items, so I've selected a short list to study. The list includes current products, business strategies, and social practices that characterize modern society. All are popularly identified and widely recognized as elements of modern society. In some cases a particular person is viewed as the primary innovator associated with the item. These individuals are frequently applauded as being a genius.
Some of the most widely adopted items and their popular innovators, along with items that have become newsworthy through anticipation that their adoption is impending and expected to have dramatic effects on society, are given in the following table:
World-Renowned Items and Recognized Innovators

| The Innovation | Top Producer | Innovator | Earliest Prototype | Demonstration Date |
| --- | --- | --- | --- | --- |
| The Smart Phone | Apple | Steve Jobs | Martin Cooper, Motorola | April 3, 1973 |
| Social Media | Facebook | Mark Zuckerberg | Andrew Weinreich, Six Degrees | 1997 |
| Computer Software | Microsoft | Bill Gates | Cell 4 | Cell 5 |
| Autonomous Car | Tesla | Elon Musk | Cell 4 | Cell 5 |
| Animated Movies | Disney Studios | Walt Disney | Cell 4 | Cell 5 |
| On-line Merchant | Amazon | Jeffrey Bezos | Cell 4 | Cell 5 |
In the sections that follow, each of the above listed items will be reviewed to identify its originating events and the drivers leading to its world position today. The objective is to test the thesis presented by this document. Namely, innovations are not singular products of one person (or even a small team of workers), nor are they fully produced in a set time of accomplishment. They are the result of a succession of events, activities, and joint efforts in which a key prior item established a framework on which the current item could be successfully produced. Innovations, to actually come to fruition as commercially successful, require the combined efforts of large contributing groups. While a creative person may guide the process, his skill for that may be based more on people-leadership talent than on depth in the specific technical expertise needed to produce the item. Major innovations, to succeed as commercial products, will always require very large financial investment. While a seed may be planted by an individual working in an amateurish setting, the full accomplishment cannot be declared exclusively his individual work product. In essence the de facto innovator is a system, coordinated and working interactively, to solve the technical issues, along with a financial support structure comprising investment sources willing to risk the money needed to keep the project alive.
The reviews here are possible only through the availability of the Internet, itself clearly modern society's most significant innovation. Information is contained in a plethora of websites presenting the history, technical insights, and personal biographies, along with data on scale of adoption and financial impacts, for all of the items listed above. The only requirement for accessing this information is to make the appropriate query to the Internet. The objects listed here for study are each of such scope that a complete doctoral program and thesis would be needed to do them justice, for some even a large number of doctoral theses. The reviews here can only be cursory, but hopefully of sufficient depth to fulfill the purpose of this writing. Thankfully the internet, in its modern version, has developed many sites that are aggregators for classes of information. Through such sites certain efficiencies are possible, since a single search can provide the specific links to several objects of study. In the early days of the Internet, and its precursors ARPANET and Usenet, my searches required much more effort.
Some sites that provide aggregated information useful for innovation study include:
- Wikipedia: wikipedia.org/ While Wikipedia is not thought of strictly as an aggregator, the hyperlinked keywords in each article and the substantial source listings in many articles provide a path to much related information. It is a powerful aggregator.
- Product Evolution: productevolution.org A site specifically dedicated to presenting histories of important innovations.
- MIT Technology Review: technologyreview.com/ Not so much on history, but commentary on what is happening now.
- Encyclopedia: encyclopedia.com A good source for background information.
- Vision Launch: inventors An indexed listing of inventions with their inventors.
Reviews
The Smart Phone
Firstly it should be noted that the Smart Phone is not a phone. It is a handheld computer, and one of the applications it is capable of running connects to cell communications networks so that it can be used like a cell phone. However, as we know, it is much, much more than that. The product name "Smart Phone" is a misnomer that is itself testimony to the fact that this innovation is derived from the cell phone, an earlier innovation. "Smart Phone" defines a class of products and is not a term for a specific innovation. There are a number of branded products produced by different companies, each offering specific capabilities [4]. This document will focus mainly on evolutionary innovations, how they influenced mobile communications, and how they eventually led to the Apple iPhone.

Evolution to the smart phone innovation builds on many prior innovations. Without going back to the ancient forms of communication, like tom-toms and smoke signals, or the earliest electrical devices like Samuel Morse's telegraph and Bell's telephone, it is enough to start with the introduction of mobile communication devices.
Stage one: Initially this was short wave radio. Short wave radio was demonstrated by Guglielmo Marconi in 1923 [5], introducing the innovation of voice communications with a wireless electronic device. This built on the discovery of radio waves in 1887 and the development of practical radiotelegraphy transmitters and receivers by about 1899. The idea of mobile communication by radio was developed over the years into portable, efficient [8], and even low-cost devices like the "walkie-talkie", which by the 1950's even included a popular kid's toy. The key innovator here was Motorola, which continues even today as an important vendor of this type of mobile communication product.
Stage two: The limitations of line-of-sight range for radio signals and of allocation of the frequency spectrum created a system need that led to the concept of cellular networks. The first days of radio employed large, high-powered antennas to give range to a broadcast station. With the development of mobile two-way radios, concepts for cellular communications technology took root to make radio communication widely available. The idea that a large geographic area could be served using many small coverage areas had been put forward by Bell Laboratories (AT&T). Cells were proposed in the 1940's; a first, limited-capability system was installed in St. Louis after AT&T proposed a "broadband urban mobile system". However, it was not until the mid-1980's that radio technology and systems were widely deployed using cells [6]. The first real cellular system in the U.S. was put into operation by the Bell System in Chicago in 1983. Ironically, cell systems were in place earlier in other places, Japan in 1979 and England before 1980. Regulatory in-fighting delayed USA projects. Cells are geographical areas where a radio signal can be received and rebroadcast using transceiver technology. Cells expand mobility for communication devices. The US innovator for this was AT&T. See a review of this development [9].
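The capacity argument behind cells can be made concrete with a back-of-the-envelope sketch. The channel and cell counts below are illustrative assumptions, not historical figures; the point is only that frequency reuse lets a fixed slice of spectrum carry many times more simultaneous calls:

```python
# Why cells multiply capacity: a toy frequency-reuse calculation.
# All quantities are assumed for illustration, not historical values.

total_channels = 100   # radio channels licensed to the operator (assumed)
reuse_factor = 7       # classic 7-cell reuse pattern: adjacent cells get disjoint channels
num_cells = 50         # small cells tiling the same city (assumed)

# One big high-powered tower: each channel can be used only once citywide.
single_tower_calls = total_channels

# Cellular layout: each cell gets 1/7 of the channels, but cells far enough
# apart reuse the same channels without interference.
calls_per_cell = total_channels // reuse_factor
cellular_calls = calls_per_cell * num_cells

print(single_tower_calls)  # 100 simultaneous calls
print(cellular_calls)      # 700 simultaneous calls from the same spectrum
```

Capacity grows further simply by splitting cells smaller, which is the system-level insight in AT&T's proposal.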
Stage three: With cells, cell phones became possible. The first handheld cellular phone call was made on April 3, 1973 by Motorola engineer Martin Cooper, from Sixth Avenue in New York City while walking between 53rd and 54th streets. [8] On September 21, 1983, Motorola made history when the FCC approved the DynaTAC 8000X phone, the world's first commercial portable cell phone. Going forward from that time Motorola produced a whole series of improved phones. Also, a number of other companies entered the field, adding their particular design concepts and features to their phones. These included firms such as Nokia, Samsung, Ericsson, Bell/IBM, Siemens, and others [10]. These "feature" phones continued to incorporate more and more of the capabilities and usability characteristics of what today are considered smart phones. Each adapted their designs to advances in cell communication technologies (2G, 3G, etc.), incorporated computer technologies, and added accessory capabilities such as graphic displays, an internet browser, and even the first camera phone, the Sanyo SCP-5300, in 2002. Each of these products advanced the state of the art well before the launch of the Apple iPhone. Apple did participate in this stage of early development when Apple CEO John Sculley spun off a group of talented employees as a separate company in May 1990. Marc Porat, Andy Hertzfeld, and Bill Atkinson founded it in Mountain View, California. Apple took a minority stake in the company, with Sculley joining the board. The company was known as General Magic [12] and in the early 1990's, before the days of browsers and the internet, produced a number of products with features that anticipated the type of product eventually produced as the iPhone. The company achieved many technical breakthroughs, including software modems (eliminating the need for modem chips), small touchscreens and touchscreen controller ASICs, highly integrated systems-on-a-chip designs for its partners' devices, rich multimedia email, networked games, streaming television, and early versions of e-commerce. However, it ceased operations on September 18, 2002 as sales dropped significantly.
Stage four: In 2007 Apple released the original iPhone. It incorporated an auto-rotate sensor, a multi-touch sensor that allowed multiple inputs while ignoring minor touches, a touch interface that replaced the traditional QWERTY keyboard, and many other features. Apple captured a healthy market share almost instantly on its release. This was followed by the iPhone 3G in 2008. The iPhone 3G was made even more desirable by the whole range of apps that could be purchased for it in the App Store. Smart phone utilization has grown to a level where 80% of the people in the USA are users of at least one phone. Many claim they cannot do without their phones. Smart phones are distinguished by their large memory, built-in operating systems, and usability features. Apple phones are distinguished by their prodigious availability of apps and high performance in running them. In summary, it is clear that the smart phone is the work product of many innovators over a long period of time. Apple's participation in this has been substantial, but not the predominant technological contribution to the art. Apple's strategy in marketing, and its promotion of the smart phone concept as a modern tool, not only a phone but a handheld portable computer, is probably as much the cause of its success as any specific technical details incorporated into the iPhone as Apple innovations. Declaring smart phones a Steve Jobs innovation, individually and exclusively, is clearly not supported by the facts.
Social Media
There is a concept in the world of social relationships that says that all people are six, or fewer, social connections away from each other. As a result, a chain of "a friend of a friend" statements can be made to connect any two people in a maximum of six steps. It was originally set out by Frigyes Karinthy in 1929 and popularized in an eponymous 1990 play written by John Guare. It is sometimes generalized to the average social distance being logarithmic in the size of the population [1] (a toy calculation below makes this concrete). Karinthy believed that the modern world was "shrinking" due to this ever-increasing connectedness of human beings. He posited that despite great physical distances between the globe's individuals, the growing density of human networks made the actual social distance far smaller. This is encapsulated in the expression "What a small world!", a remark often heard when we meet a stranger and learn that he is actually a good friend of someone we know.

This reality of nature, which existed for many years, combined with the internet as a system that facilitates communication with strangers anywhere in the world, prompted Andrew Weinreich to launch a website called "Six Degrees" in 1997. It allowed its users to create a profile and then befriend other users, known or not known to them. Andrew Weinreich is a social networking pioneer [2]. He is the founder of 7 tech startups, including Six Degrees, the world's first social network. Andrew's knack for uncovering the next big trends in technology and then building successful companies has established his unique position within the tech startup world. Clearly he is intelligent, talented, and a successful business person. But is he an innovator? Should he be awarded the title of "genius", as some public figures have proclaimed others who have accomplished similar things in other product areas?
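As an aside, the "logarithmic" generalization noted above can be checked with a toy calculation. Assume each person has k distinct acquaintances; the value k = 45 below is purely an illustrative assumption. After d steps a chain of friends-of-friends reaches roughly k^d people, so covering a population N requires:

```latex
\[
  k^{d} \approx N
  \quad\Longrightarrow\quad
  d \approx \frac{\ln N}{\ln k}
  = \frac{\ln\left(7.7 \times 10^{9}\right)}{\ln 45}
  \approx \frac{22.8}{3.8} \approx 6 ,
\]
```

so a world population of about 7.7 billion is spanned in roughly six steps, matching Karinthy's intuition.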
That question about Weinreich is a vital one in this discourse on innovation. It raises the question: what are the substantive meanings of the phrase "to innovate" and of "innovation" as an object? As I begin this review of social media, the Webster's definition given in the introduction above seems to me highly inadequate in several ways. There are dimensions to the act of innovating, and to the innovations created in that process, that go beyond the terms given in that definition. True innovations are highly unique things that deliver world-changing, beneficial results. They are products, systems, or ideas that have long-lasting qualities. They are things that change the path of progress in life's struggles to overcome disparity and deprivation. In my opinion Social Media falls short of this. It is certainly an important, popular, and wide-ranging practice with significant impacts on society. However, it is questionable whether the impacts have been beneficial or detrimental. Weinreich's Six Degrees lasted from 1997 to 2001, four years. A Wikipedia template gives a listing of all the social media sites that were active or inactive in 2012. It lists 88 active sites and 53 defunct sites. This in itself indicates a transitory and indiscriminate nature for the social media industry as a whole. The great success of the major firms such as Facebook, Instagram, Pinterest, LinkedIn, MySpace, and some others is probably based more on business-sector acumen and promotional skills than on unique innovative abilities with digital technology.
A stepwise progression in the Social Media case is not as well defined as in the case of the Smart Phone. There certainly have been a number of instances where a social media website was launched, going back to the late 1990's, and there certainly has been a series of different niche applications for which social media sites were launched. However, the evolutionary development in a technical sense has been rather flat. The main development has been in scale, not in the degree of sophistication of the product. It is acceptable to think of Six Degrees as stage one and Facebook as stage two or three, without delving deeply into the various sites launched in the intervening times.
Facebook, Inc. is an American online social media and social networking service company based in Menlo Park, California. It was founded by Mark Zuckerberg, along with fellow Harvard College students and roommates, on February 4, 2004. It is considered one of the Big Four technology companies along with Amazon, Apple, and Google [4]. The term "Social Media" is used in two contexts. The first is sociological, based on its use by people, largely of the younger generation, as a personal communication system via the internet. The second is a business context focused on the scale of its reach in society, its financial performance, and individual firm competitiveness within the industry. Facebook is the dominant player in the industry [5] but in recent times is losing "market share". Market share in this context means share of an advertising market. Social media service companies derive income largely from advertising. They are not physical product developers or manufacturers; they do not even sell software products. They do not provide tangible services, as, for example, financial service companies provide. They simply make websites available to users through their computers and data centers, sometimes for a subscription fee but mostly with no specific charges. They run ads and other promotional items on their systems, which are broadcast to users whenever they log in and appear in the displays largely with no discretion by the user except the option to click, or not click, on the ad to connect with the advertiser.
The main technological content of, and the challenges inherent in, social media service companies lie in the operational aspects of computer servers and data centers. While this is not trivial, it is not something demanding breakthrough inventive genius. Suitable computer hardware is available, although in some cases the social media engineers must be knowledgeable about the state of the art and build systems that have superior capability and efficiency. Software methods are available, but the programmers must know how to apply them properly to execute the procedures needed. Facebook, due to the scale of its operations, has the most complex challenges with these things [6]. The other challenge faced by these firms comes from the nature of the service they provide. These firms deal with information and uses of information by individuals and organizations. That information may be highly sensitive, either as personal information the user deems private or as content that may be placed in circulation by the user for purposes that may be inappropriate for distribution, harmful to other users, or even an illegal act. The solutions to these challenges are not technological; they are embodied in appropriate business management. Facebook, and Mark Zuckerberg, have been under fire recently for perhaps not dealing with this adequately. This type of thing has caused some people to minimize their use of Facebook or even close their subscription to the service.
In summary it is clear that the level of innovation in social media is not great. Facebook is a successful business but not a great innovation. While I consider Mark Zuckerberg an entrepreneur and a highly capable, smart person, I would not consider him a true innovator.
Computer Software
As a practical matter it is not really possible to review computer software in complete absence of computer hardware, so this review will overlap the two. However, the main focus here will be on how programming computer applications has evolved. Computer technology is largely driven by software needs and developments. Users want to run applications, and as more sophisticated applications are identified and attempts made to run them, new hardware with greater speed, capacity, memory systems, etc. is developed. The two go hand in hand, so that description is not always the actual sequence. It is like the chicken-or-the-egg question: "which comes first?" It's a hard question to answer. One way to answer it would be to say that in the early stages of computer evolution it was largely hardware first, but in the current environment it is software demands that are the first driver. Current hardware is approaching limits at the dimensions of a silicon atom, while software applications are still open to vast new ideas that creative programmers can envision. The earliest hardware began with vacuum tubes, an essentially obsolete technology now. Programming, once the capability for stored programs became available, has always been an intellectual process. This will always remain so, even when programs themselves are computer generated, so there is really no limit to software evolution. It should also be stated that this review is limited to digital technology; analog systems are a different thing. Also, discussion is limited to the principal operating systems in wide use and a few essential categories of application software. A total review of all available software would fill many bookshelves. As a general view, the overall field of computer technology is so vast it is possible to treat only a very small slice of it in this paper.
The history of the evolution of computers has been very widely documented in books, papers, and web sites. Perhaps it is the most documented of all the innovations ever implemented. This is easily understood, since the computer is singularly the most significant innovation man has ever conceived and implemented. It has penetrated every segment of society. It has clearly changed the whole world in such a way that our times can be defined as comprising two eras, the "pre-digital" and the "digital" ages. Even a full bibliography on computers would itself be a book, and way beyond the scope of this document.
Michael Mahoney of Princeton wrote a paper [1] that gives a comprehensive analysis and perspective on this topic, with much information on the evolution of computers and related technologies. It goes far beyond a simple timeline for the industries. His view on these things tends to support a number of my comments in this document and its fundamental thesis, particularly that innovations, and the innovators, always build on accomplishments from prior work. Also, much of the earlier work only came to be understood as a critical element much later in the evolution of computer technology.
The following listing gives some interesting and useful items presenting this history. They are listed here with links to the publication.
- The Timeline of Computer History: Computers
- The Timeline of Computer History: Software
- Book Titles: MIT Press, History of Computing Series
- Infographic History: computersciencezone.org
- A History: alanturing.net
- Innovator Biographies: computerhope.com
- The Bases of Computers: history-computer.com
In overview, the cycle of software evolution began when a computer was built that could store in memory, and run, a program written in a language convertible to machine language. The first large "Turing" computer, the ENIAC, was built at the University of Pennsylvania's Moore School of Electrical Engineering and operated in 1945. It was not "programmable" as we understand the term today; it did not have a storage unit for memory. The machine was "coded" to run a particular process by making settings in hardware, essentially setting physical switches into the alignments needed to execute a run [Hardware 1]. That view of a computer is so primitive that users today might imagine it impossible. The following paragraphs will trace this evolution.
Stage one: A computer known as the Small-Scale Experimental Machine (SSEM) was the world's first stored-program computer. It was built at the Victoria University of Manchester, England, and ran its first program on 21 June 1948. It was essentially a test bed to evaluate a memory system known as the "Williams–Kilburn Tube", named after inventors Freddie Williams and Tom Kilburn [Hardware 2], a volatile memory unit using cathode ray technology. It had a 32-bit word length and a memory of 32 words (equivalent to 128 bytes). With the success of the SSEM, the Williams–Kilburn tube was applied in a number of early computers with enhanced byte capability. It was used in the Ferranti Mark 1, the world's first commercially available general-purpose computer. It was even used in the UNIVAC I. [Hardware 4]
At this stage of technology, software as we understand the term today did not exist. Computers ran "code listings" to execute procedures whereby the internal electronic signals needed to operate the computer could run an application.
Computers understand "machine language", which is a list of digital numbers (strings of zeros and ones listed in some sequence) where each string represents a setting for the state of some component in the system. Early developers wrote primitive "assembly language" (autocode) listings in English-language characters as code that could be interpreted as machine language. The listings included processor commands and application commands; there were no "operating systems". Later autocodes had "compilers" that assembled the listing. Assembly language statements are highly cryptic, not resembling spoken languages. Since the main memory was volatile, code was not permanent and had to be loaded for each run. The code was stored on a tape (paper or magnetic). Later, code files were created on punched cards, which could be processed by a reader into tape media from which the computer could read the code. "Software" was essentially a group of paper cards with holes in them, stacked in proper order to run a program.
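To illustrate the relationship just described, here is a minimal sketch of an assembler for an invented two-instruction machine. The mnemonics, opcodes, and word format are made up for illustration and do not correspond to any historical computer:

```python
# Toy assembler: turns cryptic mnemonic statements into "machine language"
# bit strings. The instruction set is invented for illustration only.

OPCODES = {"LOAD": "0001", "ADD": "0010"}  # 4-bit opcodes (assumed)

def assemble(listing):
    """Translate lines like 'LOAD 5' into 12-bit words:
    a 4-bit opcode followed by an 8-bit binary operand."""
    words = []
    for line in listing:
        mnemonic, operand = line.split()
        words.append(OPCODES[mnemonic] + format(int(operand), "08b"))
    return words

program = ["LOAD 5", "ADD 3"]  # an assembly "code listing"
for word in assemble(program):
    print(word)  # prints 000100000101 then 001000000011
```

Each cryptic statement maps one-for-one onto a string of zeros and ones, which is exactly the translation early autocodes performed.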
Stage two: In the mid 1950's sequential developments in memory technology started to move away from the unreliable early designs. This came in stages, introducing various magnetic media systems [Hardware 3]. By the mid 1960's these provided highly reliable, higher capacity memory and very high capacity storage systems, all of which in turn facilitated larger and more complex computer operations. Languages were developed to circumvent writing in assembly language.
Programs could now be written in something resembling a spoken language and converted to machine language within the system. These were called "high-level languages". Such languages used "compilers", another level of software, to include standardized routines in an assembly code which runs as one integrated machine language program in that computer. All computers have their own specific machine code, which is dependent on their "architecture" (the design of their internal components). In the early days, assembly-level programs included statements that configured the computer to run the particular program on a particular machine. By the mid 1960's, "operating systems" were developed that configured computer operations so that programs could run on a variety of computers of a particular architecture. The evolution of this is summarized in a table [Software 3]. Also, input and output hardware was developing during this time. This led eventually to systems where the "user interface" was at the level of a keyboard and monitor screen instead of a job-order desk at a computer processing center. This allowed clever programmers to write code for more complex procedures, run it, and get instant results.
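As a hedged sketch of what such a compiler does, the fragment below translates a high-level arithmetic expression into instructions for an invented stack machine. PUSH, ADD, and MUL are assumed instruction names; no real FORTRAN or COBOL compiler was this simple:

```python
import ast

# A minimal "compiler": walks the syntax tree of an arithmetic expression
# and emits instructions for an imaginary stack machine.

OPS = {ast.Add: "ADD", ast.Mult: "MUL"}

def compile_expr(source):
    """Parse the expression, then emit instructions bottom-up:
    operands first, then the operation that combines them."""
    def emit(node):
        if isinstance(node, ast.Constant):   # a literal number
            return [f"PUSH {node.value}"]
        if isinstance(node, ast.BinOp):      # left operand, right operand, operator
            return emit(node.left) + emit(node.right) + [OPS[type(node.op)]]
        raise ValueError("unsupported construct")
    return emit(ast.parse(source, mode="eval").body)

print(compile_expr("2 + 3 * 4"))
# ['PUSH 2', 'PUSH 3', 'PUSH 4', 'MUL', 'ADD']
```

A real compiler adds parsing of full programs, register allocation, and emission of a specific architecture's machine code on top of this, but the translation step is the same in kind.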
Stage three: The path to the first widely used and powerful language for scientific work still in use today, FORTRAN, ran from the initial conceptualization in 1954 by John Backus, to a first implementation (FORTRAN I) in 1957, to the fully featured FORTRAN IV released in 1962. In that interval there were a number of other programming languages conceived and implemented [Software 2], including several languages developed by Grace Hopper that provided the foundations for COBOL. FORTRAN and COBOL were the most common everyday languages used throughout industry in the 1970's and early 80's. That was a time before the introduction of time sharing, networks, client-server architecture, and PC's. FORTRAN was the standard tool for science and engineering. COBOL was the standard tool for accounting and business information systems. "Computer" meant some sort of mainframe computer. High-level languages contributed to the rapid expansion of computer usage in industry and ultimately the development of operating system software as a way to make computer hardware utilization more efficient.
In earlier days the use of computers was based on job runs, where programmers also physically set up the run and operated the hardware that comprised the mainframe computer. This involved a lot of time-consuming activity that detracted from programming efforts and also tied up expensive computer hardware with non-productive activity. Computer coding can be viewed as having two parts, application code and processing code. Both were combined as needed to make a run. Processing code, the part that manages the hardware controls, can be broken out and standardized for use on the hardware to support many applications. This frees up programmers to program. With the hardware set-up demands managed by standardized processing code, skilled operators could run the machines while programmers programmed. This led to the development of "operating systems" software [Software 4]. The first such system was designed at General Motors Research Labs in 1956 for the IBM 704. Development progressed through a number of systems at IBM, eventually reaching an advanced and highly functional operating system in 1965, the IBM OS/360. It was the prototype for DOS/360, which for many years was the world's most widely used operating system.
Stage four: This stage embodied splitting the concept of computers into two very different worlds: (1) continuation of the "mainframe" such as it was, and (2) introduction of the computer as a tool for personal use. This review will focus mainly on the second path.
Beginning in the mid 1970's and continuing through the 1980's there was a broad-based movement toward development of computers that were smaller, lower cost, simpler to operate, and useful tools for individuals or small business needs. A number of firms participated in this, including companies such as Commodore, Wang, Data General, Atari, Apple, and others. [Hardware 5] They were seeking to establish a market niche separate from the IBM large-computer users. Beginning even earlier, and with a much greater impact, Digital Equipment Corporation ("DEC") invented and developed the "mini-computer" product. The roots of DEC go back to 1957, when Ken Olsen and Harlan Anderson, while working at MIT's Lincoln Labs, foresaw the need for a smaller computer than the big "mainframes". [Hardware 6] DEC began selling its first computer at the end of 1960. There was still a stigma to the name "computer" at that time as a big, expensive, and complicated product, so they called their computer a "programmable data processor", or PDP. Throughout the remainder of the decade, DEC created over a dozen PDP variants. One of my projects at Polaroid involved the design and construction of a plant for film manufacturing. We used a PDP-11 system as the main controller for the chemical processing part of that plant in 1970. DEC grew into the second most important computer supplier, behind IBM, in the 70's and 80's. They made very important contributions to computer technology with higher processing speeds based on advanced chip designs. However, DEC did not foresee the future direction of technology toward the PC and suffered a decline by 1990. They continued to make important contributions in computers, software, microprocessor chips, and even the internet itself with the AltaVista search engine. They eventually were bought out: Intel bought the microprocessor plant in 1997, and ultimately COMPAQ bought the computer business by 1998. Hewlett Packard acquired COMPAQ in 2002.
A distinguishing characteristic of a certain class of computers was that they fit on top of a desk and provided an integrated system of processor, I/O, and display, with easy connection to a printer or other peripheral device. They were made possible very much by the development of microprocessor chips by Intel, particularly the 8080 chip, and by the concept of a "motherboard", first used by IBM as their "planar" circuit board. The motherboard brought all the electronics together in a compact structure with interchangeable components that could be easily configured into systems with different capabilities. IBM made a historic decision to enter this field, which was a big departure from their business stance prior to that time. On August 12, 1981 they announced the introduction of the IBM PC, calling it the Personal Computer. [Hardware 7] In the next few years they established dominance in this market, releasing a series of improved systems, eventually delivering the IBM XT 286 in September of 1986 using the 80286 processor chip, and continuing to supply an installed Microsoft DOS operating system as standard on all their PC's.
IBM's business strategy for their PC departed from their long-standing approach of keeping all elements of their business in-house. The IBM PC used third-party hardware throughout, and they cooperated with suppliers on many details of their system design and technology. This eventually led to a large number of firms entering the field as suppliers of "IBM clones". Many of these were overseas businesses in low-wage countries. This drove the price of a PC down dramatically and made PC's virtually a commodity product. Almost everyone could afford one. Soon it became commonplace for households to have a working computer at home, and even children playing games on computers. By the mid 1990's worldwide production and sales of PC's were about 65 million units, with IBM a leader in market share. While still having an important market share, IBM got out of the PC business in 2005, selling it to Lenovo, a Chinese company. Lenovo continued it and grew it to become the world's largest PC business in 2015, much of it in the portable PC "laptop" business. IBM's decision to leave the PC business may have been the result of good planning, or simply fortuitous, since the introduction of smartphones in 2007 marked a change in the growth curve for PC's. The smart phone displaced PC's as the preferred portable device, which is one more indicator that the smart phone should really be called a handheld computer, not a phone.
Alongside IBM's venture into PC's, Apple was born April 1, 1976 as a company dedicated solely to the concept of the small computer for individual use. In the early 1970's computers were sold as kits to enthusiasts, who assembled them, adding their own components to a basic circuit board with key elements furnished by a computer company like Commodore, Atari, or Tandy, or a semiconductor firm like TI. Wozniak was an enthusiast who wanted to build his own computer, and Jobs was a friend; they shared computer interests. Apple entered the market with a first product called the Apple I. It was 1976, much earlier than IBM's entry and with a very different business philosophy and cultural make-up. It diverged from the typical kits of the time by offering what they called a "fully assembled" kit, which was still just something we would today call a motherboard with a processor. It did have the innovation of a keyboard interface, which was a new idea compared with other company offerings. [Hardware 8] This was followed up with the Apple II in 1977 and the II Plus in 1979, both real fully assembled desktop computers.
Stage five: The path forward at this stage can be defined as a contest between Apple, with the introduction of the Mac on January 24, 1984 [Hardware 9], and Microsoft, with the introduction of the Windows operating system on November 20, 1985 [Software 6]. These developments established two camps of PC users, with both sides highly entrenched and adamant in their particular preferences for one or the other. One of the features each brought to users was the graphical user interface ("GUI"). This provided a usability enhancement to the PC that made it possible to run applications almost without any special time-consuming learning or training exercises. So-called "out of the box" operability was the claim. Windows was deployable on almost any branded computer, while the Apple OS was bundled with the Mac and only available on an Apple computer.
Apple's OS and Windows are fundamentally different approaches to operating systems. Windows, like MS-DOS, Microsoft's earlier operating system, is a command-instruction-based architecture implemented in programming statements that do data readings and execute procedures. In this architecture Windows can be applied to various CPU designs. Also, the user has access to some information concerning the processes that are running. The Microsoft concept is one that requires the user to interact with the operating system through the mouse or keyboard to execute some procedures. Apple's OS has certain essential steps built into ROM chips not accessible to the user. The Apple concept is to have the system execute certain steps of programs directly through hardware. The system displays a result but does not provide a way for the user to initiate, or prevent the initiation of, that step. A fundamental concept of Apple's products is that they perform within a system-driven framework so that the user does not need to perform operational tasks, such as file directory maintenance or program configuration. All procedures are launched by selecting an icon on a graphical interface using touch-screen technology. This has allowed Apple to control its marketing by forcing all users who want Apple software performance features to buy Apple's hardware.
Autonomous Car
The term "autonomous car" must be differentiated from "computer-controlled car"; they are different things. There were several concepts for cars that drive themselves proposed around the time of the first cars ever made. Likewise, there have been several stages of cars with "cruise control" for quite a while [1, 2]. However, cars operated by computers are, even today, a technology only in its early stage of development, albeit well along toward becoming a practical reality [3].

A good description of what the distinguishing features of a computer-controlled car might include is given in a paper by John McCarthy written in 1996 at Stanford University [3]. The early concepts of autonomous cars were built on a variety of mechanical or electromechanical devices to relieve a driver from actually controlling the car; these devices could take over, making the vehicle autonomous to a degree. However, a real computer-controlled car is something much more. McCarthy's description embodies the imagination of what that could be, at a time when computer technology still needed much further development to make such a thing a reality.
Animated Movies
On-line Merchant
Summation
Prologue
The modern world is replete with innovative items of all kinds. Some are less widely applied than those in the above listing, but still are major developments of world importance. Identification of a single prominent innovator for these things is more obscure than for the above listed items, probably because it is more widely understood that these items evolved from diverse previous items that gradually morphed into their modern counterparts. Even a partial listing of such things would fill pages of this document. The following table includes a small selected list:
Selected Items and Innovators

| Item | First Producer | Innovator | First Market Item | Date of Entry | Ref. |
| --- | --- | --- | --- | --- | --- |
| Nuclear Energy | Westinghouse | Cell Three | Shippingport | 1957 | DOE; NC State |
| Supersonic Aircraft | Republic Aircraft | John Stack | YF-100 Super Sabre | 1953 | Bell-SX1 History; Current Planning |
| Credit Cards | Flatbush National Bank, Brooklyn, NY | John Biggins | Charge-It Card | 1946 | The Future |
| Designer Medicines | Zeneca | Corwin Herman Hansch | Elavil (amitriptyline) | 1961 | wikipedia; Review Article; Review Article |
| Battery Power | Cell Two | Alexandre Volta | Link | Cell One | History |
| Space Exploration | Cell Two | Cell Three | Link | Cell One | Link |
| Others | Cell Two | Cell Three | Link | Cell One | Link |
Supplemental Readings
- The Next Big Future: Link
- Icons of Invention: Google Books
Sources
Item One: Smartphone
- List
- What Exactly is Innovation: Forbes   |   Other Blogs by Ms Greenwald
- Inventours: Link
- Smart phone brands: Link
- Short wave radio: Link   |   Radiotelegraphy
- Cellular communication: electronicsnotes.com
- The First Cell Phone Call: Link
- Motorola: Link   |   cell phone   |   two way radio
- A Brief History of Mobile Communication: Review Paper. Presents many details on the evolution of cell technology.
- Cellphone Evolution: Link
- Smartphone Evolution: Link   |   Statistics
- General Magic: History
Item Two: Social Media
- The Underlying Principle: Wikipedia
- Andrew Weinreich: About
- Social media sites: Wikipedia template
- Facebook: Wikipedia
- Industry: Statistics
- Facebook System: Technology
Item Three: Computers
1. History of Computing (an overview, taken in 1988, of this "new science"), Michael S. Mahoney: Link   |   local file location: Here

Part 1. Hardware (memory)
- Memory Evolution: File
- Memory: Williams Tube
- Advances in Memory: Wiki Overview   |   Non Volatile Memory
- UNIVAC I: Link   |   File System: Brochure
- Desktop Computers: 1975 - 1984   |   1985 - 1994   |   1995 - 2004
- Personal Computers: History IBM PC
- Personal Computers: History Apple I   |   Apple II and II Plus
- Mac Computers: history-computer.com
- Source: Link
- The Software Preservation Group, softwarepreservation.org
- Timeline of programming languages: wikipedia.org
- Software Evolution: File
- Operating Systems: softwarepreservation.org
- Operating Systems: IBM
- Windows: history-computer.com
- Source: Link
Item Four: Autonomous Car
- Historical overview: digitaltrends.com
- Industry overview: theverge.com
- History Update: digitaltrends.com
- Computer Controlled Cars: John McCarthy, Stanford
- Source: Link
- Source: Link
- Source: Link