Communication Design Studio: Project 2

John Baldridge
11 min read · Sep 29, 2020
Argo Self-Driving Car. Source: Carnegie Mellon University Advancement

WEEK 5 — CLASS 9

Tuesday, September 29, 2020

Project 2 introduction

For project 2, I was tasked with learning more about autonomous vehicles (or self-driving cars), a topic very close to home for someone living in Pittsburgh. In Pittsburgh, there are three major players working in the autonomous vehicle space: Carnegie Mellon University (General Motors), Uber’s Advanced Technologies Group, and Argo AI.

What is an autonomous vehicle?

Definitions are important. Given the novelty of the technology involved, let us first define “autonomous vehicle.” According to Techopedia, autonomous vehicles can guide themselves without human intervention. An autonomous vehicle is also known as a driverless car, robot car, or self-driving car.

According to SAE International, there are six levels of driving automation, ranging from no automation (fully human-driven) to full automation (no human intervention).

“Levels of Driving Automation” Standard for Self-Driving Vehicles. Source: https://www.sae.org/news/press-room/2018/12/sae-international-releases-updated-visual-chart-for-its-%E2%80%9Clevels-of-driving-automation%E2%80%9D-standard-for-self-driving-vehicles

Why is it important?

The idea of autonomous vehicles might sound new to us, but in reality, people have been working on this technology for quite a while.

Dean Pomerleau and Todd Jochem posed with NavLab 5 20 years ago before setting off on a trip to California. The minivan did almost all of the steering on the tour, which was called No Hands Across America. Source: https://www.cs.cmu.edu/news/look-ma-no-hands-cmu-vehicle-steered-itself-across-country-20-years-ago

In fact, researchers at Carnegie Mellon University developed an autonomous vehicle over 20 years ago that drove itself 2,800 of the 2,850 miles between Pittsburgh and San Diego (Bloomberg, 1995).

“Driving is the most complex activity that most adults engage in on a regular basis. Just because we do it doesn’t mean we can teach computers to easily do it. It will be many more years for full automation.” - Raj Rajkumar, George Westinghouse Professor, Electrical and Computer Engineering, Carnegie Mellon University

Argo AI defines its business as developing the virtual driver system and high-definition maps designed for Ford’s self-driving vehicles. Automakers like General Motors, Toyota, and Ford are acquiring companies with AI expertise, such as Argo AI.

According to the Automotive Software and Electronics 2030 report published by McKinsey in 2019, the automotive industry will see more change in the next ten years than it has in over a century. McKinsey believes this change will be driven primarily by four mutually reinforcing trends: autonomous, connected, electric, and shared (ACES) vehicles (McKinsey, 2019).

Automotive software and electronics 2030. Mapping the sector’s future landscape.

Companies like Uber see the future of mobility as “increasingly shared, sustainable, and automated,” in large part due to autonomous vehicles. If successful, autonomous vehicles have the potential to change our lives for the better in numerous ways.

For example, with the advent of autonomous vehicles, we could see safer roads, fewer cars, fewer emissions, and more equitable transportation systems. Think about the more than 2 million Americans who are homebound, elderly, or physically disabled and legally unable to drive. They could all benefit from a system that gives them back their independence.

How does it work?

WEEK 5 — CLASS 10

Thursday, October 1, 2020

In this class, we reviewed the Scott McCloud reading, “The Vocabulary of Comics” (Understanding Comics, Ch. 2). McCloud has been described as either “comics’ leading theorist” or a “deranged lunatic,” depending on who you ask. He uses humor to provoke reactions throughout the comics community and, increasingly, beyond it.

Screen capture from “The Vocabulary of Comics” (Understanding Comics, Ch. 2, page 46)

Using McCloud’s ideas, we worked together to try to illustrate the passage below:

Sneezing can’t really be controlled — it’s one of the body’s reflexes, and is typically associated with irritation in the nose. From here, the signal is sent via neural pathways to the brain, resulting in a powerful release of air through your mouth and nose, which not only helps expel mucus or irritants from the nasal passages as fast as possible, but also contracts a bunch of muscles in the body, including the eyelids and the trachea.

WEEK 6 — CLASS 11

Tuesday, October 6, 2020

How do autonomous vehicles work?

What is it?

Autonomous vehicles can guide themselves without human intervention. An autonomous vehicle is also known as a driverless car, robot car, or self-driving car.

Autonomous vehicles are thinking machines that react to the built world around them, just like us, kinda.

How does it work?

Instead of using eyes, ears, and the senses of smell and touch to receive information, autonomous vehicles make use of sensors, actuators, and cameras. They also use machine learning, algorithms, and powerful processors to replicate the functions of our own brains. Much like a human driver who loses control of their vehicle during a rainstorm, then corrects and remembers the lesson for next time, autonomous vehicles store that important information and learn over time.

Unlike human drivers, autonomous vehicles have built-in GPS systems that allow them to maintain an accurate map of their surroundings. Radar (RAdio Detection And Ranging) systems built into the vehicle monitor the position of nearby vehicles, pedestrians, and other objects. High-definition video cameras are also used to detect and read road signs, traffic lights, and other visual cues along the journey.
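For the technically curious, here is a minimal Python sketch of the physics behind radar speed measurement, not taken from any real vehicle’s software. The 77 GHz carrier frequency (a common automotive radar band) and the Doppler shift value are illustrative assumptions.

```python
# A toy illustration of Doppler-based radar ranging, purely for intuition.

C = 299_792_458  # speed of light, in m/s

def relative_speed(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Relative speed of a reflecting object, in m/s.

    A wave reflected off an object moving at speed v comes back shifted
    by f_d = 2 * v * f0 / c, so v = f_d * c / (2 * f0).
    """
    return doppler_shift_hz * C / (2 * carrier_hz)

# An object closing at ~30 m/s (about 67 mph) shifts a 77 GHz signal
# by roughly 15.4 kHz.
print(f"{relative_speed(15_400):.1f} m/s")  # -> 30.0 m/s
```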

In addition to radar, autonomous vehicles use LiDAR (Light Detection And Ranging) sensors, which bounce pulses of light off objects in the surrounding area in order to measure distances and detect road edges. LiDAR units can often be seen on top of autonomous vehicles, spinning around, shooting out invisible laser beams and catching the reflections. The time it takes for a pulse to return yields a distance measurement, so the vehicle can compute how far away an object is.
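The same back-of-the-envelope math can be sketched in code. This toy example assumes a single hypothetical pulse; real LiDAR units fire millions of pulses per second.

```python
# A toy illustration of LiDAR time-of-flight ranging.

C = 299_792_458  # speed of light, in m/s

def lidar_distance(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in meters.

    The pulse travels out and back, so halve the round-trip distance.
    """
    return C * round_trip_s / 2

# A pulse that returns after ~200 nanoseconds hit something ~30 m away.
print(f"{lidar_distance(200e-9):.1f} m")  # -> 30.0 m
```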

Lastly, autonomous vehicles have powerful software that processes all of this information in real time, much like the human brain. The computer can then send instructions to the vehicle’s actuators, which control steering, braking, and acceleration. Much like human drivers, autonomous vehicles learn over time, making good use of predictive modeling and learned experience to become more effective.
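To make the sense-think-act cycle concrete, here is a toy sketch of one tick of that loop. The Perception fields, the 20-meter braking threshold, and the command dictionary are all invented for illustration; they are not a real autonomous-vehicle interface.

```python
# A drastically simplified sense -> think -> act loop.

from dataclasses import dataclass

@dataclass
class Perception:
    obstacle_ahead: bool  # fused from camera + radar + LiDAR
    distance_m: float     # range to the nearest obstacle

def plan(world: Perception) -> dict:
    """Turn the fused picture of the road into actuator commands."""
    if world.obstacle_ahead and world.distance_m < 20.0:
        return {"steer": 0.0, "brake": 1.0, "throttle": 0.0}
    return {"steer": 0.0, "brake": 0.0, "throttle": 0.3}

# One tick of the loop with fake sensor output: a stopped car 15 m ahead.
world = Perception(obstacle_ahead=True, distance_m=15.0)
print(plan(world))  # -> full braking, no throttle
```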

Other opportunities for autonomous technology

Why is it important?

Companies like Uber see the future of mobility as “increasingly shared, sustainable, and automated,” in large part due to autonomous vehicles. If successful, autonomous vehicles have the potential to change our lives for the better in numerous ways.

For example, with the advent of autonomous vehicles, we could see safer roads, fewer cars, fewer emissions, and more equitable transportation systems. Think about the over 2 million Americans that are homebound, elderly shut-ins, or the physically disabled that are legally not allowed to drive. They could all benefit from a system that will give them back their independence.

WEEK 6 — CLASS 12

Thursday, October 8, 2020

Before this class, I spent some time thinking about flow and storyboard. I specifically thought about the storyboard in terms of the following:

Visual: what information do I plan to communicate by using visuals.

Temporal: what information would be helpful to show using animation or motion.

Aural: what information would best be explained using sounds like narration/story, sound effects, and ambient noise.

I would like to have an establishing shot showing an autonomous vehicle navigating a city street with no help from a human.

Then I would like to have visuals. I will use the “just show it” method described by Moyer (Napkin Sketching, pg 80). I will show an autonomous vehicle and then show the various components (cameras, LiDAR, radar, sensors, etc.). By describing the important parts that are unique to an autonomous vehicle, I can make this unfamiliar concept easier to grasp for the reader (Moyer, Napkin Sketching, pg 81).

Next, I will show both the front view and profile of the vehicle, labeling the technical components, coupled with a brief description. I would then like to show how it works using the human-brain metaphor coupled with the process-flow example (Moyer, pages 94 and 108).

The human brain metaphor helps illustrate the similarities and differences between an autonomous vehicle and a human driver. A process diagram showing how the vehicle receives information and then processes it, much like the human brain, might be helpful.

According to Moyer, “What we understand about a known topic can accelerate our understanding of a new topic.” People roughly understand how the human brain takes in various information using tools (eyes, ears, cognition) to make informed decisions.

Lastly, I would like to show a vignette over time. This might be a good way to show why this matters. I envision three vignettes: 1) autonomous vehicles can improve safety, 2) they can create more equitable mobility, and 3) they can create a more sustainable future.

WEEK 7 — CLASS 13

Tuesday, October 13, 2020

Mood board

Key animations

Autonomous vehicles have the ability to recognize objects like stop signs and make decisions based on the instructions given, using a sophisticated clustering algorithm (see the sketch after this list).
The RADAR system detects the direction and speed of nearby objects and feeds the vehicle’s onboard computer data on the current and developing state of the road. Operation and safety depend on the accuracy of the sensor system that feeds information to the computer.
In addition to the RADAR system, the LiDAR acts as the eyes of the autonomous vehicle. It provides a 360-degree view of the surrounding area, which allows the vehicle to see things like a car getting too close and make snap adjustments in real time.
An autonomous car’s navigation system is based on the Global Positioning System (GPS), enabling the car to navigate roads and reach specific destinations.
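As promised above, here is a minimal sketch of the clustering idea, grouping fake 2-D LiDAR-style returns into objects with scikit-learn’s DBSCAN. The points and parameters are invented for illustration; I don’t know which algorithms any particular AV stack actually uses.

```python
# A toy clustering of LiDAR-style points into "objects" with DBSCAN.

import numpy as np
from sklearn.cluster import DBSCAN

# Fake 2-D returns: one tight blob (a car?) plus scattered noise.
rng = np.random.default_rng(0)
car = rng.normal(loc=[10.0, 2.0], scale=0.3, size=(40, 2))
noise = rng.uniform(low=0.0, high=20.0, size=(10, 2))
points = np.vstack([car, noise])

# Points within 0.5 m of each other, with at least 5 neighbors,
# get grouped into one cluster; the label -1 marks noise.
labels = DBSCAN(eps=0.5, min_samples=5).fit_predict(points)
print("clusters found:", len(set(labels) - {-1}))
```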

WEEK 7 — CLASS 14

Thursday, October 15, 2020

What did you learn from reviewing others’ work?

I learned how others are using visuals and narrative to explore their topics. I would like to revisit my narrative work to make sure I’m telling a compelling story that makes sense. I also learned that some parts would benefit from being illustrated/animated while others might work better as static or real video shots.

What did you learn from the review of your work?

I learned where I could improve the story and some of the metaphors I am using. For example, perhaps I could lean into the metaphor more. I also learned that maybe I should focus on the journey of the car (as if it were the main character) for the “how does it work” section. I learned that the animations I started are working well, but I might want to consider how they would work together as one fluid animation.

What are the next steps you plan to take? Why?

I plan to revisit my narrative and think of a better way to bring it all together. I also want to start working on the illustration style and color palette, and on how I might create a video style that complements it without being too jarring. I also want to revisit my storyboard and make sure the flow works well with the updated narrative component.

Illustration style

I started to develop an illustration style in Adobe Illustrator, aiming to strike a balance between simple and informative.

Color Palette Inspiration

Source: https://m.futurecar.com/4106/Silicon-Valley-Lidar-Startup-Luminar-to-Go-Public-in-a-Blank-Check-Deal-at-a-$2-9-Billion-Valuation

The color palette was informed by the visuals that an autonomous vehicle “sees” when scanning its surroundings with the LiDAR system.

Tie-ins to class readings

On page 50 of Meredith Davis’s book “Graphic Design Theory,” Davis says that the goal of representation “is to engage the audience more deeply in reflection about concepts and to inform judgments about the significance and possible courses of action.”

WEEK 8 — CLASS 15

Tuesday, October 20, 2020

I wanted to define each of the main technologies the autonomous vehicle uses to navigate space. I created a separate icon and color scheme, assigning one color to each technology. This will allow me to highlight each technology and show how they differ.

WEEK 9 — CLASS 18

Thursday, October 29, 2020

First draft

I was unhappy with the quality of the narration in DRAFT 001, given that we did not have access to the CMU studio to do the recording. Luckily, my friend Daniel, an engineer and science writer who has done radio narration and segments for NPR member station WESA, kindly agreed to do the narration. Daniel not only had access to good equipment but was also eager to lend his voice to the project. Many thanks to Daniel!

I provided art direction regarding the tone and style. The tone of the video should be cool, techie, upbeat, and conversational. The style is meant to be casual yet informative.

Second draft

Dan Boyarski visited our class to review the drafts of our videos. Overall, Dan thought the video was working well and liked how certain important keywords and phrases showed up in certain parts of the video. He suggested that I add a few more of those elements at the beginning and end to help make my key points.

After receiving feedback from Stacie, Dan, and the class, I have made many adjustments to the final video below.

First, I shot video clips of a driver to help fill in the gaps in the narrative where I talk about the driver getting tired, distracted, or impaired. At first, I thought this could be animated, but I eventually agreed that it would take away from the current, more important, animated parts; using real video seemed more appropriate. I also found a video clip I shot in 2019 of an Uber Volvo driving through Oakland and a photo I took while visiting the Uber HQ in San Francisco in 2018.

Second, I adjusted the pacing and alignment of key animations and display text, which helped the overall flow of the video. Below you will find the final video.

WEEK 10 — CLASS 20

Thursday, November 5, 2020

Final video

Takeaways

The biggest takeaway from this project was the importance of peer feedback. Every time I presented my draft to Stacie or a fellow classmate for critique, I received invaluable feedback that allowed me to take the project to the next level. I also appreciated that I started crafting the concepts early in the project; this gave me the time I needed to tweak and refine the overall concept and to test whether or not it was working.

In future projects, I will continue to allow for enough time to receive feedback. I will also be keenly aware of the visual, temporal, and aural components of my project.

