This article from Scientific American traces the innovation behind the iPhone through three of the technologies that underpin it: the touch screen, the lithium battery and the Internet.

In recent years, the Great Man theory has made a comeback in popular culture, recast around entrepreneurs, tech start-ups and digital conglomerates: Elon Musk revolutionized the electric car, Mark Zuckerberg pioneered social networking, and Steve Jobs and his team at Apple invented the iPhone.

These heroic narratives are neither factually correct nor helpful.

From an educational perspective, there’s a whole generation that grew up on inspirational YouTube videos that promoted individualism and some disquieting leadership traits. Yet the challenges the world faces — the energy crisis, food shortages, climate change, overpopulation — require the cooperation of all of us. These challenges are too complex, interconnected, and changing for any individual, idea, organization, or nation to solve. We need to take advantage of the fundamental principle that underpins all research — standing on the shoulders of giants, each new breakthrough building on the work of those who came before. The secret story of the iPhone is proof of that.

The hard work and innovation of Apple’s many teams cannot be questioned. But the iPhone would not have been possible without hundreds of previous research breakthroughs and innovations. Each of those studies is the result of layers of different innovations by countless researchers, universities, funders, governments and private companies.

To prove it, we took a closer look at three research breakthroughs underpinning the iPhone.

Touch screen

The first touch screen was actually invented in the 1960s by Eric Arthur Johnson, a radar engineer working at a British government research centre.

In 1965, Johnson published his article "Touch Display - A Novel Input/Output Device for Computers" in Electronics Letters, a journal of the Institution of Engineering and Technology, and the paper is still cited by researchers today. His 1969 patent was later cited in many notable inventions, including Apple's 1997 patent for a "portable computer handheld mobile phone."

Since Johnson's first leap, touchscreen technology has attracted billions of dollars of investment from public institutions and private investors, with one often begetting the other. The University of Cambridge, for example, recently set up a limited company to secure more funding for its touchscreen research, ultimately raising $5.5 million from British and Chinese venture capitalists.

An Apple patent for touch-screen technology cites more than 200 peer-reviewed scientific papers published by academic societies, commercial publishers and university presses. The authors of those papers did not work alone: most were members of research teams, and many were supported by grants. During peer review, the process at the heart of academic research, at least one outside scholar independently evaluated each piece of work. A recent article on touchscreen technology published in Elsevier's journal Information Sciences, for example, involved six authors and two anonymous reviewers.

Extrapolating conservatively from the 200-plus articles cited in Apple's patent, we estimate that more than 1,000 researchers have made significant contributions to touchscreen technology.
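As a rough illustration of how that extrapolation might work, here is a back-of-the-envelope sketch; the assumed average of five authors per paper is ours, purely for illustration, and it ignores authors who appear on more than one paper.

```python
# Back-of-the-envelope check of the "200 cited papers -> 1,000+ researchers" claim.
# The average team size of five authors per paper is an illustrative assumption,
# and overlap between author lists is ignored for simplicity.

papers_cited = 200           # peer-reviewed papers cited in Apple's touchscreen patent
avg_authors_per_paper = 5    # assumed typical size of a research team

estimated_researchers = papers_cited * avg_authors_per_paper
print(f"Estimated contributing researchers: {estimated_researchers}")
# Prints: Estimated contributing researchers: 1000
```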

Johnson may have taken a first step, and Apple has taken full advantage of its potential, but we owe touchscreen technology to the combined efforts of many researchers around the world.

The lithium battery

British scientist Stanley Whittingham invented the first rechargeable lithium battery in the 1970s while working in Exxon's research laboratories, building on earlier work he had carried out with colleagues at Stanford University. That research had shown that lithium could be used to store energy, but Whittingham and his team figured out how to do it at room temperature, without the risk of explosion.

Oxford University professor John Goodenough then improved on Whittingham's design, using metal oxides to boost performance. That piqued the interest of Sony, which commercialised lithium batteries in the 1990s and launched a mobile phone powered by them in Japan in 1991. All of this laid the groundwork for the lithium-ion battery that Apple put in the iPhone in 2007, which has since sold more than a billion units.

The lithium-ion battery story doesn’t stop there.

Murata, one of Apple's main suppliers, bought Sony's lithium-ion battery business in 2016.

Now 95, John Goodenough continues his pioneering research. Just a few months ago he published a landmark study in a journal of the American Chemical Society, describing a lithium-based battery for electric cars that reportedly withstands 23 times more charge cycles than the current average.

The Internet and the World Wide Web

Tim Berners-Lee is widely credited with inventing the World Wide Web. In the 1980s he worked at the European Organisation for Nuclear Research, better known by its French acronym, CERN. CERN was established in 1952 by 12 European governments and is still funded by its member states.

Berners-Lee's idea began as a solution to a very specific problem at CERN: how best to facilitate the sharing and updating of the vast amounts of information and data used by CERN's researchers. His proposal was based on the concept of hypertext, first described by Ted Nelson in a 1965 paper published by the Association for Computing Machinery. Much like an electronic version of the footnote systems used by researchers around the world, hypertext lets you jump from one information source to another; unlike a footnote, the source it points to can be anywhere on the network and take any form.
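To make the idea concrete, here is a minimal sketch in Python of a hypertext-style document whose links can point to sources anywhere on the network and in any form; the document, names and addresses are hypothetical, purely for illustration.

```python
# A minimal, illustrative model of hypertext: a document whose links can point
# to information held anywhere on the network, in any form. All names and
# addresses below are hypothetical.

from dataclasses import dataclass, field

@dataclass
class Document:
    title: str
    body: str
    links: dict[str, str] = field(default_factory=dict)  # anchor text -> target address

note = Document(
    title="Detector run notes",
    body="See the detector [spec] and the raw [data].",
    links={
        "spec": "http://docs.example.org/detector-spec.html",  # another hypertext page
        "data": "ftp://files.example.org/run42.tar",           # any format, on any host
    },
)

def follow(doc: Document, anchor: str) -> str:
    """Jump from one information source to another by following a link."""
    return doc.links[anchor]

print(follow(note, "data"))  # -> ftp://files.example.org/run42.tar
```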

If the World Wide Web is a map, the Internet is our world: a network infrastructure that connects millions of computers around the globe, enabling them to communicate with one another and exchange vast amounts of information.

For the origin of the Internet, we have to go back to 1965.

While Nelson was describing hypertext and Johnson was inventing the touch screen, Thomas Merrill and Lawrence Roberts, two researchers at the Massachusetts Institute of Technology, used a simple, low-speed dial-up telephone line to connect their computer to another 3,000 miles away in California. Soon afterwards came ARPANET, not a dystopian AI system but the Advanced Research Projects Agency Network. Conceived and funded by the Defense Advanced Research Projects Agency (DARPA), ARPANET was originally intended to link the US military's computers across regional centres.

And ARPANET was a precursor to the Internet.

True innovation is never plain sailing. But these early breakthroughs of the space age were the foundation for everything that followed. While today’s iPhone is 120 million times more powerful than the computer that sent Apollo 11 to the moon, its real power lies in its ability to tap into the billions of websites and terabytes of data that make up the Internet.

A brief analysis of these three breakthroughs alone shows that there have been more than 400,000 research publications since Apple filed its first phone-related patent in 1997. And we have barely scratched the surface.

There are countless other studies without which the iPhone would not exist. Some are well known, others less so. GPS and Siri both grew out of US military research, while the complex algorithms that make digitisation possible were originally developed to detect nuclear tests. All of them have research at their core.

The iPhone is an era-defining technology. Technologies that define an era come not from the brilliance of one person or organization, but from layer upon layer of innovation and decade after decade of research. Thousands of individuals and organizations stand on one another's shoulders and look to the future. In our time of seemingly insurmountable global challenges, we must not only remember this, but be inspired by it.

We must encourage openness and transparency in research centres and ensure that research is disseminated as widely, quickly and clearly as possible. Integrity and reproducibility of research, transparent peer review, open access, diversity: these are not just buzzwords; they are exciting steps in reforming the infrastructure of the global research ecosystem, and they are our best hope for the future.
