• ## Early Computing: Crash Course Computer Science #1

Hello, world! Welcome to Crash Course Computer Science! So today, we’re going to take a look at computing’s origins, because even though our digital computers are relatively new, the need for computation is not.

• ## Electronic Computing: Crash Course Computer Science #2

We ended last episode at the start of the 20th century with special purpose computing devices such as Herman Hollerith’s tabulating machines. But the scale of human civilization continued to grow, as did the demand for more sophisticated and powerful devices. Soon, these cabinet-sized electro-mechanical computers would grow into room-sized behemoths that were prone to errors. But it was these computers that would help usher in a new era of computation - electronic computing.

• ## Boolean Logic & Logic Gates: Crash Course Computer Science #3

Today, Carrie Anne is going to take a look at how those transistors we talked about last episode can be used to perform complex actions. With just two states, on and off, the flow of electricity can be used to perform a number of logical operations, which are guided by a branch of mathematics called Boolean Algebra. We’re going to focus on three fundamental operations - NOT, AND, and OR - and show how they were created in a series of really useful circuits. And it’s these simple electrical circuits that lay the groundwork for our much more complex machines.
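The three operations can be sketched in a few lines of Python, modeling each gate as a function on True/False "signals" (the XOR gate at the end is one illustrative way to compose them, not the only circuit that works):

```python
# Minimal sketch of the three fundamental Boolean operations,
# treating True/False as the "on"/"off" electrical states.

def NOT(a):
    return not a

def AND(a, b):
    return a and b

def OR(a, b):
    return a or b

# Gates compose into more complex circuits, e.g. XOR built
# entirely from NOT, AND, and OR:
def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))
```

Just as in hardware, everything more complicated is built by wiring these small pieces together.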

• ## Representing Numbers and Letters with Binary: Crash Course Computer Science #4

Today, we’re going to take a look at how computers use a stream of 1s and 0s to represent all of our data - from our text messages and photos to music and webpages. We’re going to focus on how these binary values are used to represent numbers and letters, and discuss how our need to perform operations on larger and more complex values brought us from our 8-bit video games to beautiful Instagram photos, and from unreadable garbled text in our emails to a universal language encoding scheme.
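The core idea - that one stream of bits can mean different things depending on the agreed-upon encoding - can be shown in a couple of lines of Python:

```python
# The same byte, interpreted two ways.

bits = "01000001"        # eight bits, i.e. one byte

number = int(bits, 2)    # read as an unsigned binary number: 65
letter = chr(number)     # read as a character code (ASCII/Unicode): 'A'
```

The bits themselves never change; only our interpretation of them does.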

• ## How Computers Calculate - the ALU: Crash Course Computer Science #5

Today we're going to talk about a fundamental part of all modern computers. The thing that basically everything else uses - the Arithmetic and Logic Unit (or the ALU). The ALU may not have the most exciting name, but it is the mathematical brain of a computer and is responsible for all the calculations your computer does! And it's actually not that complicated.
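The ALU's central trick is building addition out of logic gates. A rough Python sketch: a half adder produces a sum bit and a carry bit, and chaining full adders "ripples" the carry along to add multi-bit numbers (an 8-bit ripple-carry adder here, as an illustration):

```python
# Addition built from logic operations, ALU-style.

def half_adder(a, b):
    return a ^ b, a & b            # (sum bit, carry bit)

def full_adder(a, b, carry_in):
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2             # (sum bit, carry out)

def add_8bit(x, y):
    result, carry = 0, 0
    for i in range(8):             # ripple the carry bit by bit
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result                  # anything past 8 bits overflows away
```

Note how overflow falls out naturally: adding 255 and 1 wraps around to 0, just as it does in real 8-bit hardware.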

• ## Registers and RAM: Crash Course Computer Science #6

Today we’re going to create memory! Using the basic logic gates we discussed in episode 3, we can build a circuit that stores a single bit of information, and then through some clever scaling (and of course many new levels of abstraction) we’ll show you how we can construct the modern random-access memory, or RAM, found in our computers today.
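The single-bit circuit the episode builds up to is a gated latch; a simplified behavioral sketch in Python (the class and method names here are illustrative, not standard terminology from any library):

```python
# A gated latch: stores one bit. While "write enable" is on, the
# data input is captured; while it's off, the cell holds its value.

class GatedLatch:
    def __init__(self):
        self.value = 0

    def update(self, data, write_enable):
        if write_enable:
            self.value = data
        return self.value

# RAM is, conceptually, a big addressable grid of such cells:
class TinyRAM:
    def __init__(self, size):
        self.cells = [GatedLatch() for _ in range(size)]

    def write(self, address, data):
        self.cells[address].update(data, write_enable=True)

    def read(self, address):
        return self.cells[address].value
```

The "random access" part is simply that any address can be read or written directly, in any order.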

• ## The Central Processing Unit (CPU): Crash Course Computer Science #7

Today, we’re going to build the ticking heart of every computer - the Central Processing Unit or CPU. The CPU’s job is to execute the programs we know and love - you know, like GTA V, Slack... and PowerPoint. To make our CPU, we’ll bring in the ALU and RAM we made in the previous two episodes, and then, with the help of Carrie Anne’s wonderful dictation, (slowly) step through some clock cycles.
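The fetch-decode-execute cycle at the heart of the CPU can be sketched as a loop over a toy machine. The two-and-a-half-instruction set below (LOAD, ADD, HALT) is made up for illustration, not taken from any real CPU:

```python
# A toy fetch-decode-execute loop with one register (the accumulator)
# and a program counter. Opcodes are hypothetical.

LOAD, ADD, HALT = 0, 1, 2

def run(program):
    accumulator, pc = 0, 0
    while True:
        opcode, operand = program[pc]    # FETCH the next instruction
        pc += 1
        if opcode == LOAD:               # DECODE and EXECUTE it
            accumulator = operand
        elif opcode == ADD:
            accumulator += operand
        elif opcode == HALT:
            return accumulator

result = run([(LOAD, 5), (ADD, 3), (HALT, 0)])   # → 8
```

Each pass through the loop corresponds to the clock cycles Carrie Anne steps through in the episode.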

• ## Instructions & Programs: Crash Course Computer Science #8

Today, we’re going to take our first baby steps from hardware into software! Using that CPU we built last episode, we’re going to run some instructions and walk you through how a program operates on the machine level. We'll show you how different programs can be used to perform different tasks, and how software can unlock new capabilities that aren't built into the hardware.

• ## Advanced CPU Designs: Crash Course Computer Science #9

So now that we’ve built and programmed our very own CPU, we’re going to take a step back and look at how CPU speeds have rapidly increased from just a few cycles per second to gigahertz! Some of that improvement, of course, has come from faster and more efficient transistors, but a number of hardware designs have been implemented to boost performance.

• ## Early Programming: Crash Course Computer Science #10

Since Joseph Marie Jacquard’s textile loom in 1801, there has been a demonstrated need to give our machines instructions. In the last few episodes, our instructions were already in our computer’s memory, but we need to talk about how they got there - this is the heart of programming. Today, we’re going to look at the history of programming.

• ## The First Programming Languages: Crash Course Computer Science #11

So we ended last episode with programming at the hardware level with things like plugboards and huge panels of switches, but what was really needed was a more versatile way to program computers - software!

• ## Programming Basics: Statements & Functions: Crash Course Computer Science #12

Today, Carrie Anne is going to start our overview of the fundamental building blocks of programming languages. We’ll start by creating small programs for our very own video game to show how statements and functions work. We aren’t going to code in a specific language, but we’ll show you how conditional statements like IF and ELSE statements, WHILE loops, and FOR loops control the flow of programs in nearly all languages, and then we’ll finish by packaging up these instructions into functions that can be called by our game to perform more and more complex actions.
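Although the episode stays language-agnostic, the same building blocks can be sketched in Python - IF/ELSE choosing between branches, FOR and WHILE repeating work, and a function bundling it all up for our hypothetical game to call:

```python
# Control flow and a function, video-game style. The game and its
# numbers are invented for illustration.

def take_damage(health, amount):
    if amount >= health:        # IF/ELSE: pick one of two branches
        return 0
    else:
        return health - amount

health = 10
for hit in [3, 4]:              # FOR: repeat once per item in a list
    health = take_damage(health, hit)

while health > 0:               # WHILE: repeat until a condition fails
    health = take_damage(health, 1)
```

Packaging the IF/ELSE logic into `take_damage` means the game can reuse it anywhere, which is exactly the point of functions.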

• ## Intro to Algorithms: Crash Course Computer Science #13

Algorithms are the sets of steps necessary to complete a computation - they are at the heart of what our devices actually do. And this isn’t a new concept. Since the development of math itself, algorithms have been needed to help us complete tasks more efficiently. Today we’re going to look at a couple of modern computing problems, like sorting and graph search, and show how we’ve made them more efficient so you can easily find cheap airfare or map directions.
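As a taste of both problem families, here is a sketch of one simple sorting algorithm (selection sort) and one simple graph search (breadth-first search, which finds a fewest-hops route, much like a simplified map-directions query):

```python
from collections import deque

def selection_sort(items):
    # Repeatedly find the smallest remaining item and swap it into place.
    items = list(items)                 # work on a copy
    for i in range(len(items)):
        smallest = min(range(i, len(items)), key=lambda j: items[j])
        items[i], items[smallest] = items[smallest], items[i]
    return items

def bfs_hops(graph, start, goal):
    # Breadth-first search explores level by level, so the first time
    # we reach the goal is via a route with the fewest hops.
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, hops = queue.popleft()
        if node == goal:
            return hops
        for neighbor in graph[node]:
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append((neighbor, hops + 1))
    return None                          # goal unreachable
```

Real airfare and routing engines use far more sophisticated algorithms, but the shape of the problem is the same.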

• ## Data Structures: Crash Course Computer Science #14

Today we’re going to talk about how we organize the data we use on our devices. You might remember last episode we walked through some sorting algorithms, but skipped over how the information actually got there in the first place! And it is this ability to store and access information in a structured and meaningful way that is crucial to programming. From strings, pointers, and nodes, to heaps, trees, and stacks, get ready for an ARRAY of new terminology and concepts.
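Two of those terms - nodes with pointers, and the stack built on top of them - can be sketched in a few lines of Python (the class names here are the generic textbook ones, used for illustration):

```python
# A linked list node: a value plus a "pointer" to the next node.
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

# A stack (last-in, first-out) layered on top of linked nodes.
class Stack:
    def __init__(self):
        self.top = None

    def push(self, value):
        self.top = Node(value, self.top)   # new node points at old top

    def pop(self):
        value = self.top.value
        self.top = self.top.next           # unlink the old top
        return value
```

The same node-and-pointer idea, rearranged, gives you queues, trees, and the other structures in the episode.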

• ## Alan Turing: Crash Course Computer Science #15

Today we’re going to take a step back from programming and discuss the person who formulated many of the theoretical concepts that underlie modern computation - the father of computer science himself: Alan Turing. Normally, we try to avoid “Great Man” history in Crash Course because, truthfully, all milestones in humanity are much more complex than an individual or single lens.

• ## Software Engineering: Crash Course Computer Science #16

Today, we’re going to talk about how HUGE programs with millions of lines of code like Microsoft Office are built. Programs like these are way too complicated for a single person; instead, they require teams of programmers using the tools and best practices that form the discipline of Software Engineering. We'll talk about how large programs are typically broken up into functional units that are nested into objects - an approach known as Object-Oriented Programming - as well as how programmers write and debug their code efficiently, document and share their code with others, and how code repositories are used to allow programmers to make changes while mitigating risk.

• ## Integrated Circuits & Moore’s Law: Crash Course Computer Science #17

So you may have heard of Moore's Law, and while it isn't truly a law, it has pretty closely estimated a trend we've seen in the advancement of computing technologies. Moore's Law states that we'll see approximately a 2x increase in transistors in the same space every two years, and while this may not hold true for much longer, it has dictated the advancements we've seen since the introduction of transistors in the mid-1950s. So today we're going to talk about the improvements in hardware that made this possible - starting with the third generation of computing: integrated circuits (or ICs) and printed circuit boards (or PCBs). But as these technologies advanced, a newer manufacturing process would bring us to the nanoscale manufacturing we have today - photolithography.
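The doubling is easy to put in arithmetic terms - as a back-of-the-envelope sketch (taking the Intel 4004's roughly 2,300 transistors in 1971 as a starting point, and ignoring that real chips never track the curve exactly):

```python
# Moore's Law as arithmetic: double the transistor count
# every two years.

def moores_law(start_count, years):
    return start_count * 2 ** (years // 2)

# 2,300 transistors projected 20 years out: 10 doublings,
# i.e. 2300 * 1024 = 2,355,200 transistors.
projection = moores_law(2300, 20)
```

Ten doublings in twenty years turns thousands of transistors into millions - which is roughly what the industry actually delivered.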

• ## Operating Systems: Crash Course Computer Science #18

So, as you may have noticed from last episode, computers keep getting faster and faster, and by the start of the 1950s they had gotten so fast that it often took longer to manually load programs via punch cards than to actually run them! The solution was the operating system (or OS), which is just a program with special privileges that allows it to run and manage other programs. So today, we’re going to trace the development of operating systems from the Multics and Atlas Supervisor to Unix and MS-DOS, and take a look at how these systems heavily influenced popular OSes like Linux, Windows, MacOS, and Android that we use today.