
Artificial Intelligence Journal



Change blindness describes how normal people don’t notice massive, obvious changes in their environment

Code Compiled: A Short History of Programming - Part 2
By Omed Habib

This is the story of software. The first post in this series was all about the structural formation of programming languages. We went all the way back to steampunk days to see how the framework for programming grew out of Charles Babbage’s Analytical Engine in the 1840s, and we ended with a list of the most active programming languages in use today. Now we’ll take the next logical step and examine what programming has done for enterprises and SMBs. We’ll also trace the shockwaves it has sent through the worlds of databases, communications, and mobility.

Technological Change Blindness
There’s a strange phenomenon known as change blindness that describes how normal people don’t notice massive, obvious changes in their environment. It can emerge from gradual shifts or very rapid transformations that are interrupted by a distraction. For example, a study by Cornell found that test subjects didn’t notice when a researcher, posing as a lost tourist, was replaced by someone else who looked completely different midway through the questioning.

Change blindness is happening right now on a societal level when you reflect on what programmable software has accomplished. Consider how radically our world has been transformed over the past two decades, partially due to hardware upgrades, but mostly due to programming.

In the last decade alone, we’ve seen society rebuilt around a wave of new software-driven platforms and services.

For anyone too young to have seen it or too busy to remember, here’s a recap of how business records and communications operated in the pre-software era.

Life Before Software
How many times per day do you use your computer? That question doesn’t really make sense for most workers today, because they never stop using their computers. This goes beyond developers to every single person in the organization. Every time you check the time, write a note, or make a call, you probably do it on the web or on a mobile device. Here are just a few of the jobs that didn’t exist in the recent past:

10 years ago
Global total of app developers: roughly zero.
There were the basics of social media, but no social media managers. There were no departments devoted to cloud engineering. Big data analysis was primarily academic. Development and operations didn’t become DevOps until 2009. Even the title “web developer” didn’t get a Bureau of Labor Statistics (BLS) designation until 2010.

20 years ago
There was no such thing as an online marketer. PPC didn’t exist before 1996, and the first keyword auction kicked off in 1998. In 1995, there were only 16 million internet users on the entire planet. Wireless engineers were battery specialists, because the 802.11 WiFi protocol came out in 1997 and widespread adoption would take another decade.

40 years ago
The late 1970s introduced personal computers to the business world, and the modern digital world as we know it can be traced back to that moment. Before that, computers were room-sized monsters like the IBM S/360. In 1976, there were no Apple computers, no Tandy TRS-80s, no Commodore 64s, and no Texas Instruments 99/4s, and the IBM PC was still years away. If you were a programmer, you might be working in UNIX, Pascal, COBOL, C, or Prolog and carrying around a suitcase full of punch cards. You might have a job switching reels of the giant magnetic tapes that computers used for storage. Crashes were common, and recovering from one meant manual intervention rather than a quick reboot. You might spend the day pulling up floor tiles looking for twisted cables. Perhaps the most astonishing thing about this picture is that some of the people you work with right now probably remember those days.

When Windows Were Only Glass
Before computers, offices tended to be loud and smoke-filled. Typewriters rattled everywhere and you could tell who was at work by the cigarette smoke curling above the desk.

Customer data, billing, legal documents, and other important records were made of paper and stored in boxes. The boxes were usually kept in a giant file room that had to be updated daily. Security was often non-existent, and a disaster like a fire could wipe out a business in minutes. Contacts were often kept in paper Rolodex files, and everyone had their own.

With the arrival of personal computers, software fundamentally changed all business processes, making them repeatable, transferable, and vastly more productive.

The Database That Changed the World
You can spend endless hours arguing about which software has had the biggest impact on history, but every story has to start with the relational database management system (RDBMS), which took shape at IBM in 1974 with the System R project, building on E. F. Codd’s 1970 paper on the relational model. From the time electronic computers took off in the 1940s until the early 1970s, there was no systematic way to store and access data. To find and retrieve information, you had to know where it was stored and how the program that stored it worked.

Codd’s relational model, later codified in his famous twelve rules, became the universal template for storing and structuring data. It led directly to Structured Query Language (SQL), IBM’s DB2, Oracle, and the database wars of the 1980s, and relational descendants such as MySQL, MariaDB, and Percona Server still underpin the global web. Today, software that has to manage the sheer volume and velocity of big data often reaches for non-relational databases, but even these have their origins in Codd’s model.
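Codd’s core insight was that you could describe *what* data you wanted, not *where* it lived or how the storing program worked. Here is a minimal sketch of that idea using Python’s built-in sqlite3 module; the table and column names are purely illustrative, not drawn from any particular system:

```python
# A minimal sketch of the relational model: rows in tables, related
# by keys, retrieved declaratively with SQL (via SQLite in memory).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two related tables; the foreign key expresses the relationship.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, "
    "customer_id INTEGER REFERENCES customers(id), total REAL)"
)

cur.execute("INSERT INTO customers VALUES (1, 'Ada'), (2, 'Grace')")
cur.execute("INSERT INTO orders VALUES (10, 1, 99.50), (11, 1, 12.00), (12, 2, 40.00)")

# The query describes the desired result, not the storage layout --
# the separation Codd's model made possible.
cur.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
""")
print(cur.fetchall())  # [('Ada', 111.5), ('Grace', 40.0)]
```

The same declarative query would keep working even if the physical storage changed underneath, which is exactly why the relational model outlived the systems that first implemented it.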

The Grid and Cloud-Based Software
The history and impact of the internet are too large a subject to cover here, but cloud-based software is its latest expression. Software as a Service (SaaS) grew out of “The Grid,” a concept developed by Ian Foster and Carl Kesselman in the mid-1990s, not long after the birth of the World Wide Web.

They imagined that software should be a metered utility, like electricity, where people just plugged into a grid of resources. Doing that depended on the development of effective cluster management and data residency. Clustered and networked computers used the rapidly developing internet protocols to fetch, process, and deliver data.

That meant you had plenty of CPU capacity, but the actual machine doing the work could be thousands of miles away. The speed of the communications channels hadn’t caught up with the computing capacity at either end, introducing delays in fetch and execute operations. I/O bottlenecks were common, and cloud-based software started to gain a reputation for unreliability.

In terms of cloud security, the earliest threats are still the strongest: data breaches by malicious actors, data leakage from developer errors, identity compromise through insecure credentials, and insecure APIs exposed to untrusted sources. Today, whole industries rely entirely on cloud-based deployments despite the ongoing security challenges. SaaS was soon joined by Platform as a Service (PaaS) and Infrastructure as a Service (IaaS). The mobile workforce revolution would not have been possible without them.

Mobile Software for Working Remotely
Over the past 20 years, telecommuting has gone from a dream to a necessity. A Gallup poll showed that over a third (37 percent) of U.S. workers telecommute some of the time, compared with single digits before 1996. Of those who do telecommute, one in four work remotely more than ten days every month. In terms of effectiveness, 74 percent of those surveyed said that telecommuters are just as productive or much more productive than their co-workers.

The mobile workforce revolution is tied closely to the development of BYOD (“bring your own device”) and “workshifting,” the practice of moving work to non-traditional times and locations. The three software trends that made this possible were the business app ecosystem, tighter security management tools for remote logins, and data center control panels that could handle all that network traffic. Put them together and the traditional office starts to look like an unnecessary capital expense whose main function is to serve as a backdrop for press conferences. IDC projects that mobile workers will account for 72 percent of the US workforce by 2020.

Industries Without Supply Chains
Arguably, the area that has seen the most dramatic changes from recent software advances is the finance industry. Finance has no logistics and no production supply chain to worry about. Information about money is what financial firms sell, and companies differentiate themselves on how well they manage that information. That’s why expanding internet access and robust data analysis have meant so much to the industry. Unlike other information-driven industries, finance touches every single individual alive today, and each entity, whether a person or a corporation, can hold an unlimited number of accounts.

No industry has been rocked by more software-driven disruption, much of it built by SMBs rather than large enterprises. Finance has seen the introduction of new business models like crowdfunding, new forms of online currency like Bitcoin, new approaches to data integrity like blockchain, and new transaction concepts like peer-to-peer lending.

We’ll go much deeper into these issues for the third and final blog in this series. We’ll look back at how programming changed banks and insurance companies with databases in the 1960s, then follow that through to the latest big data analytics driving capital markets today. You’ll see how programming and software advances have affected all business concerns, from precision marketing to risk management.

Learn More
In case you missed it, read about ‘Code Compiled: A Short History of Programming – Part I.’ Stay tuned for ‘Code Compiled: A Short History of Programming – Part III.’

The post Code Compiled: A Short History of Programming – Part II appeared first on Application Performance Monitoring Blog | AppDynamics.

