
We are surrounded by digital technology in our modern world. From phones, to computers, to even the woven cloth we wear as clothes, most things can be broken down into discrete symbols or building blocks (like the 1s and 0s of code, or “over/under” in weaving). However, one digital technology in particular is taking our society by storm: AI. Some background on the nature and mechanisms of artificial intelligence can be found in other posts on this blog. In this post, however, I will be discussing people’s preconceived ideas about AI in the STEM field.

As people know, AI is becoming more and more capable of replacing people in jobs such as manual labor, clerical work, factory work, and various writing positions, just to name a few. It wouldn’t be unreasonable to assume that it will take over EVERY job we have, and to be scared. However, this isn’t really the case (yet…). One good example of jobs that can’t be replaced by AI yet is those in the STEM fields. In short, AI can’t replace more complex jobs like these, but it can help accelerate the speed of the work.

In my case, I am a biochemistry major. One line of work that correlates with my major is working with genes and DNA. In the human genome, there are tens of thousands of genes that interact with one another in unimaginably complex ways to help our bodies function, make us look a certain way, and make us who we are. Trying to manually determine what each of these individual genes does is quite literally impossible from a time-conservative perspective. With AI, however, this process can be exponentially sped up: AI can retrieve data about genes in seconds, link gene functions autonomously from a library of genes, and simulate what they could theoretically do without having to carry out long experiments. As of right now, though, AI cannot account for genes not in its database or decide what experiments to run on new genes, so the ‘field of exploration’ still lies mainly in the hands of humans.

One thing to take away from this is that you should not be scared of AI. Instead, you should embrace it and learn how to use it, as it’s here to stay. It will only do damage to your position if you let it; harness it to accelerate your progress and expand your capabilities beyond what your time frame would otherwise allow.


The Old Telephone vs. the iPhone 

Brief History:

The concept of fast communication across long distances started with the telegraph in 1838. This invention was able to send instant messages across wires in the form of “buzzing” patterns commonly known as Morse code. It wouldn’t be until 1876 that Alexander Graham Bell invented the first telephone. The operation of the telephone was initially pretty crude, as operators had to manually connect lines. Over the course of about 100 years, calls began to be relayed by radio signal via microwave towers, followed by electronic switching (digitization). This switch to electronics paved the way for transmitting digital data and communications.

The Old Telephone 

– First model was invented in 1876 by Alexander Graham Bell

– Analog phones persisted up until the 1980s

– Large and heavy

– Could only send voice signals over a wired network

– Low quality and relatively high noise

– The overall mechanism consisted of converting sound (voice) into an analog electrical signal that traveled through a copper wire and was converted back into sound on the receiving end

     – Voice transmitted continuously as a wave

     – While traveling across copper wire, the signal was susceptible to noise, distance loss, and interference (roughly simulated in the sketch below)
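
To make the noise problem concrete, here is a minimal Python sketch of a voice wave picking up a little random noise in every stretch of wire. The tone frequency, segment count, and noise level are made-up illustration values, not real telephone-line numbers:

```python
# Analog transmission sketch: every wire segment adds noise, and once
# the noise is mixed into the wave there is no way to strip it out.
import numpy as np

t = np.linspace(0, 1, 1000)           # one second of "voice"
voice = np.sin(2 * np.pi * 5 * t)     # a clean test tone

signal = voice.copy()
for segment in range(10):             # 10 stretches of copper wire
    signal = signal + np.random.normal(0, 0.05, t.size)  # each adds noise

print(f"average distortion after 10 segments: {np.abs(signal - voice).mean():.3f}")
```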


The Modern iPhone

– First model was made in 2007 by Steve Jobs after the creation of 3G, the first mobile internet

– Has a variety of uses other than just communication via calls/voices

– Instead of wires, iPhones use wireless networks and digital signals with the help of cell towers, Wi-Fi, and satellites

– In digital signals, voices are recorded, encoded, compressed, and then sent wirelessly to a receiver that unpacks and decodes the message

     – Avoids noise and interference (see the sketch below)
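
Here is a rough Python sketch of why digital transmission dodges that noise: bits ride as two widely separated voltage levels, so the receiver can threshold away moderate noise and recover the exact bits. The voltage levels and noise strength are illustrative values, not real network specs:

```python
# Digital transmission sketch: noise lands on the wire, but thresholding
# at the receiver recovers the original bits exactly (almost always).
import numpy as np

bits = np.random.randint(0, 2, 32)        # the encoded message
sent = np.where(bits == 1, 5.0, 0.0)      # bit 1 -> 5 V, bit 0 -> 0 V

noisy = sent + np.random.normal(0, 0.8, bits.size)  # channel noise
received = (noisy > 2.5).astype(int)      # threshold halfway between levels

print("bit errors:", np.sum(received != bits))      # usually 0
```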


Brief Background 

Since the industrialization of humankind, there have been a plethora of advancements in technology. This can largely be attributed to digitalization. Essentially, this means converting an analog signal (continuous, gradient-like) into a digital one (discrete, think binary code). These discrete signals bring many benefits, including the removal of noise and interference, compact packaging of data for smaller storage, the ability to be computed and interpreted by computers, and instantaneous transmission over long distances.

 

Here is a list of the many analog creations humankind has brought about, along with their digital counterparts. 


Analog 

  • Needle meters
  • Hand clocks/watches
  • Vinyl record player
  • Analog radio (AM/FM, dials)
  • Film camera 
  • VHS tapes
  • Landline phone 
  • Telegraph 
  • Mercury thermometer 

Digital

  • LED meters
  • LED clocks/watches
  • Streaming apps (Spotify, Apple Music, etc…)
  • Digital radio (internet, HD)
  • Digital camera
  • Streaming platforms
  • iPhone
  • Text messaging
  • Digital thermometer

As many of the posts on this page address, the digitization of human technology has brought about significant advancements in what we are capable of. Two major parts of this process are analog-to-digital (A/D) and digital-to-analog (D/A) converters.

Normally, analog signals cannot be used or computed by digital hardware/systems. However, A/D and D/A converters bridge the worlds of analog and digital so that devices can utilize and translate both kinds of signals. Like their names imply, A/D converters turn analog signals into digital data, and D/A converters turn digital data back into analog signals.

ADCs:

Most of the technology that we use today contains digital systems. However, the signals they receive are analog (sound, light, temperature, pressure, movement, etc…). Digital systems are unable to process or compute them without turning those signals into digital data first, which is where ADCs come in. ADCs work by measuring the analog value at regular intervals (samples per second, AKA hertz). Then, these values are encoded into binary code (0s and 1s) based on the bit depth, at which point the data is digital and can be worked with by digital systems.
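
Here is a minimal sketch of those two ADC steps in Python. The sample rate, bit depth, and 440 Hz test tone are example values chosen just for illustration:

```python
# ADC sketch: sample an analog wave at a fixed rate, then encode each
# sample as an integer code whose range is set by the bit depth.
import numpy as np

sample_rate = 8000                    # samples per second (Hz)
bit_depth = 8                         # 8 bits -> 2**8 = 256 levels
levels = 2 ** bit_depth

t = np.arange(0, 0.001, 1 / sample_rate)     # 1 ms worth of sample times
analog = np.sin(2 * np.pi * 440 * t)         # a 440 Hz tone, range -1..1

codes = np.round((analog + 1) / 2 * (levels - 1)).astype(int)
print(codes)    # integer codes a digital system can store and compute on
```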

DACs:

Once the digital data has been processed or computed, it needs to be converted back into an analog signal so that we humans can perceive the resulting information, which is accomplished by DACs. DACs translate each number of the code into a corresponding voltage or current. Then, the voltage or current is run through a filter that smooths it into a clean analog signal that humans can perceive.
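
And a matching Python sketch of the DAC side, reusing the kind of integer codes from the ADC example above. The moving-average filter here is a crude stand-in for the analog smoothing filter a real DAC would use:

```python
# DAC sketch: map each digital code back to a voltage, then smooth the
# stair-step output so it resembles a continuous analog wave again.
import numpy as np

codes = np.array([128, 170, 205, 230, 245, 248, 240, 220])  # from an ADC
levels = 256                                                # 8-bit codes

voltages = codes / (levels - 1) * 2 - 1    # map 0..255 back to -1..1

kernel = np.ones(3) / 3                    # simple moving-average filter
smooth = np.convolve(voltages, kernel, mode="same")
print(np.round(smooth, 2))                 # a cleaner, analog-like wave
```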

 

Here are my woven initials!

Basic Guide on How to Weave:

  1. Determine the layout of your design: which squares are white/black or under/over (this maps neatly onto the 1s and 0s of binary; see the sketch after this list)
  2. Lay out the loom and tape the top part of the strings to a surface, like a table
  3. Obtain weft (horizontal strips) and begin weaving them over or under the loom strings in accordance with the desired design
  4. Once everything is woven, line everything up and secure the ends of the woven design to hold everything in place
  5. Show it off to your friends! The more practice you do, the more complex designs you can make!
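
Since a weave really is binary, here is a quick Python sketch that stores an “over/under” design as 1s and 0s and prints it. The grid below is just a made-up example pattern, not my actual initials:

```python
# Each row is one weft strip: 1 = over the loom string, 0 = under.
pattern = [
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1],
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 1],
]

for row in pattern:
    print("".join("█" if cell else "░" for cell in row))
```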

What are neural networks? What do they have to do with AI?

In general terms, a neural network is a system of algorithms that tries to mimic how humans learn. It’s made up of layers of nodes (also called neurons), which are connected to each other and pass information forward through the network. These nodes can be divided into three groups: the input nodes, the hidden layers, and the output nodes.

The input nodes are responsible for taking in data.

The hidden layers are where the data is processed. Each node applies a mathematical function to the data it receives and passes the result on. In general, the more nodes and connections that are present, the more complex the patterns the network can learn.

The output nodes produce the final interpretations or results of the data, which are then presented to human eyes.
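
Here is a tiny Python (NumPy) sketch of those three groups in action: data enters the input nodes, each hidden node computes a weighted sum, and the output nodes report the result. The layer sizes and random weights are placeholders; a trained network would have learned its weights (via something like the gradient descent sketched further below):

```python
# Minimal forward pass: input (3 nodes) -> hidden (4 nodes) -> output (2 nodes)
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 4))    # connections from input to hidden layer
W2 = rng.normal(size=(4, 2))    # connections from hidden to output layer

def forward(x):
    hidden = np.maximum(0, x @ W1)   # each hidden node: weighted sum + ReLU
    return hidden @ W2               # output nodes: plain weighted sums

x = np.array([0.5, -1.0, 2.0])       # data arriving at the input nodes
print(forward(x))                    # the network's two output values
```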

Neural Networks and Machine Learning vs. the Brain

Neural Networks
– Basic units are nodes
– Nodes are relatively basic; each performs simple weighted sums
– Transmitted signals are numerical values
– Signal transmission takes picoseconds (FAST!!!)
– Billions of node connections
– Learn through:
     – gradient descent (sketched below)
     – brute force
     – supervised learning (human-generated data)
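
To give a feel for the gradient descent item above, here is about the smallest possible Python example: one weight gets nudged against the slope of its error until the error shrinks. The target value and learning rate are arbitrary illustration numbers:

```python
# Gradient descent sketch: repeatedly step a weight w downhill on the
# squared error (w - 3)^2 until it settles near the target, 3.0.
w = 0.0
learning_rate = 0.1
for step in range(50):
    error_gradient = 2 * (w - 3.0)   # derivative of (w - 3)^2
    w -= learning_rate * error_gradient
print(round(w, 4))                   # converges toward 3.0
```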

The Human Brain
– Basic units are neurons
– Neurons are complex, with thousands of synapses each
– Transmitted signals are biochemical electrical signals
– Signal transmission takes microseconds
– 10^14-10^15 synapses
– Learn through:
     – assumptions and models
     – different types of memory collection
– Although neuron signals travel much slower than NN signals, the brain is (for now) able to learn much faster than NNs, as we don’t have to look at something 1,000 times to finally learn it. This is due to the thousands of synaptic connections that human neurons have.

Digital vs. Analog 


Digital and analog

Throughout history, the conversion from analog to digital has helped us advance technology in many ways. For example, going digital has made measurement devices more accurate by converting continuous signals, which are prone to noise, into discrete signals. These discrete signals help give clearer outputs.

Digital

What does it mean to be digital?

When something is digital, it means that it is built from discrete values that can be combined into a nearly infinite number of outputs. Essentially, a finite set of symbols is arranged in different ways to create a practically unlimited range of responses.
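
A quick Python sketch shows how fast those combinations blow up with just the two symbols 0 and 1 (the message lengths are arbitrary examples):

```python
# With a finite alphabet, the number of distinct messages grows
# exponentially with length: 2 symbols give 2**n messages of length n.
symbols = ["0", "1"]
for length in [8, 64, 256]:
    print(f"messages of length {length}: {len(symbols) ** length}")
```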

Examples of Digital

1. Binary code 

2. Punch cards 

3. Neurons

4. Words

5. Hieroglyphics

 

Analog

What does it mean to be analog?

Analog refers to something that is more like a scale or a gradient as opposed to discrete values that make up a response.

Examples of analog 

1. Radios

2. Meters with needles

3. Coordinate functions

4. Old telephones

Which Is Better? Digital or Analog?

Most people may think of analog as “old-school” or outdated technology; however, that couldn’t be further from the truth. Although the digitization of technology has no doubt brought along significant advances, it comes with tradeoffs that people need to weigh before choosing whether to make something digital or analog.

Digital

Pros

  • Less affected by interference/removes most noise
  • Easy to store and duplicate data
  • Able to be processed by computers
  • Can have precise, repeatable data

Cons

  • Needs to be converted back to analog for humans to comprehend (DAC)
  • If the bit depth is too low or computing power is limited, the result will have low resolution

Analog

Pros 

  • Continuous data/no discrete values (gradient-like) 
  • High resolution 
  • Can function with less/no computing power

Cons

  • Difficult to duplicate and store
  • More noise/interference occurs
  • Cannot be processed by computers (need ADC)

Introduction

It’s hard to believe how far humans have come in the context of technology. What started out as stones, sticks, and fire laid the foundation for unimaginable technological growth that has resulted in what we are capable of using today: computers that can store nearly endless amounts of information, phones that allow us to communicate with people across the seas in a matter of seconds, and artificial intelligence that far surpasses the raw computing power of the human mind. None of this would have been possible without the development of digitalization. In order to truly understand the origins of all the technology we have, we must first look at the history of ‘going digital’. For more examples and definitions of analog and digital technologies, visit the “Digital vs. Analog” post.

The 1800s

At this point, the technologies that had been invented were limited and are now seen as impractical and inefficient. However, they laid the foundation for what we have today. Although it may come as a surprise to some people, a variety of these early technologies utilized digital concepts.

  • The loom – a machine that used the “over and under” concept of weaving. The options “over” and “under” can be related to the 0s and 1s of binary code
  • Telegraph – although it used analog signals, the messages it sent were made of discrete characters, which makes it an early example of a digital technology (see the Morse code sketch after this list)
  • Telephone – continuous, analog voice transmission
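
To see the telegraph’s digital side in code, here is a tiny Morse encoder in Python. The dictionary holds only a small sample of the real Morse table:

```python
# Discrete characters as digital building blocks: letters map to fixed
# dot/dash patterns, which a telegraph key can tap out over the wire.
MORSE = {"S": "...", "O": "---", "E": ".", "T": "-"}

def to_morse(text):
    return " ".join(MORSE[ch] for ch in text.upper() if ch in MORSE)

print(to_morse("SOS"))   # ... --- ...
```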

1900-1970

The analog inventions of the early 1900s consisted primarily of honing the transmission, collection, and interpretation of radio signals/waves. From the 1950s to the 1970s, digital technologies started becoming more practical, but were not yet outright replacing their analog counterparts.

  • Radio – utilized analog radio signals and transmission
  • Television (analog) – Uses analog/continuous electrical signals to display video and audio. These signals are susceptible to noise and interference, which gave rise to static. 
  • Computers – Digital computers became commercialized
  • Other digital inventions – Watches, calculators, and microcomputers/microprocessors became widely available to consumers

 

1980-1990

The latter part of the 1900s saw not only the high point of analog technology, but also its decline, followed by the rise of digital technologies.

  • Cellular Networks – allowed for the quick transmission of packaged digital signals over long distances with minimal noise or interference
  • Television (digital) – Instead of using analog electrical signals, these TVs used packaged, encoded signals that were better for long distance transmission and noise avoidance. 

 

2000-present 

  • Digital Phones – Upgraded technology permitted the transmission of digital information that could be used for much more than calling, such as texting, surfing the internet, playing games, etc…
  • AI – Artificial intelligence has been around for many years, but has only recently gotten to a point where it can really be used to advance human knowledge and capabilities. AI functions via neural networks that consist of nodes, which can be likened to neurons in the brain. Each node transmits and computes digital information, and adding more nodes and layers of nodes has produced the advanced AI that we have been seeing today.