0:11

Really, in the first two weeks, we've completely ignored the issue of time.

Everything happened immediately: there were inputs, and immediately the outputs were some function of them.

The inputs never changed; we didn't have any notion of change.

We didn't have any notion of things happening one after another.

It was just some kind of logical mapping from inputs to outputs.

But that, of course, is not what really happens in real life.

We need to be able to take care of time.

We need to think about time, because computers work over time.

So, what do we need from time?

Well, on the positive side, there are two things that we want to do.

The first is that we want to be able to

use the same hardware to compute many things, one after another.

If we know how to add two numbers,

we don't want to use a certain piece of hardware to add two numbers just once.

We want to use the same piece of hardware to add numbers every time we have

two numbers that we need to add.

We just need to add them one after the other.

So, we need to be able to reuse our hardware.

For example, if we have some kind of loop in a software program,

and the loop calls for doing the same

thing many times, we want to be able to use the same hardware to do that.

Another important thing that we need time for

is to actually remember things from the past.

When we need to remember intermediate results,

we need to remember where we are in a computation.

For example, suppose we have a loop that adds 100 numbers to each other and

gets a total sum.

We of course will need to remember the intermediate sums.

Otherwise, there's no way for us to actually build up the total sum.
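The running-sum loop just described can be sketched in code. This is a minimal illustration (the numbers 1 through 100 are an assumed example, not from the lecture): the running total is the intermediate result that must be remembered between steps, while the same addition is reused at every step.

```python
# Sketch: summing 100 numbers one at a time.
numbers = range(1, 101)   # assumed example data: 1, 2, ..., 100
total = 0                 # intermediate sum, remembered across iterations
for n in numbers:
    total = total + n     # the same "adder" is reused at every step
print(total)              # 5050
```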

1:46

These are the two things that we really want to think about time for.

And yet, there's another aspect of time that we really don't want to think about,

but that we need to basically

sweep under the rug in a satisfactory manner.

And that is the fact that computers work at some finite speed.

And, we need to be able to make sure that we don't

ask the computer to perform computations faster than it can.

Or that, at least, we know how fast our computer can run.

2:36

What we're going to do is what everyone does in computer science:

we're going to convert this continuous physical time,

which is very complicated to think about, into discrete time.

We're going to have what's called a clock,

which is some kind of oscillator going up and down at a certain fixed rate.

And each cycle of the clock we're going to treat as one discrete, integer time unit.

So, once we have this clock,

it will basically break up our physical continuous time

into a sequence of time equals one, time equals two, time equals three, and so on.

Within each time unit, we're going to treat time

as though it were one indivisible thing.

Nothing changes within a time unit, within an integer time unit.

So, for example, suppose we have a NOT gate.

If we look at its input and its output, at every different time unit,

it can have a different input.

And, at that time unit, it will compute the output from that input

in an instantaneous manner, as we think about it.

Every time unit the input could change, and then the output would follow.

So, if you look at the diagram we see here,

we see that at time 1 the input is 1.

The output is then of course 0, because that's what a NOT gate does.

At time 2, the input went down to 0.

It could have stayed at 1, but

since it went down to 0, immediately the output goes up to 1, and so on.
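As a rough sketch of this discrete-time view (not part of the lecture's materials), the NOT gate can be modeled as a function applied anew to the input at each integer time unit; the input sequence below is an assumed example:

```python
def not_gate(bit):
    """Combinatorial NOT: the output depends only on the current input."""
    return 1 - bit

inputs = [1, 0, 0, 1]                      # input at time units 1, 2, 3, 4
outputs = [not_gate(x) for x in inputs]    # output, "instantaneous" per unit
print(outputs)                             # [0, 1, 1, 0]
```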

4:18

When you really look at a physical signal that's

implemented somehow, probably with some electrical signals,

it doesn't change instantaneously, in zero time, between logical time one and

logical time two, or between logical time zero and logical time one.

Really, the current builds up slowly, and

the voltage, which may be how we represent the actual bit,

changes slowly.

And what we really see, if we look at the

actual analog signal inside our implementation of the gate, is

some kind of waveform like the one we see here in time unit one.

It takes time for the input to reach its final state.

4:55

And then, it will also take some time for the output to reach its final state.

Probably, it will take more time than the input does, because there's

an additional delay: the delay of the gate itself.

The whole point of breaking time into digital,

integer units is that we don't want to think about these delays.

As long as our clock cycle is not too fast, as long as we

give ourselves enough time between consecutive time units,

we can ignore everything that happens at the beginning of the cycle,

all the gray area here.

And as long as, by the end of the gray area, all the signals have reached their true,

final, and consistent state, we're all done.

In fact, the way we choose the cycle of the clock

is to make sure that all the hardware really stabilizes,

and the implementation gives us the logical operations,

by the end of the gray area; the clock cycle needs to be a little bit wider than that.
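As a back-of-the-envelope sketch of this constraint, with purely hypothetical delay numbers (none of these figures come from the lecture): the clock period must be a bit longer than the worst-case settling time of the signals.

```python
# Hypothetical timing numbers, for illustration only.
input_settle_time = 2.0   # ns for the input signal to stabilize (assumed)
gate_delay = 1.5          # ns of additional delay through the gate (assumed)
safety_margin = 0.5       # ns of extra slack (assumed)

# The clock period must cover the whole "gray area" plus some margin.
min_clock_period = input_settle_time + gate_delay + safety_margin   # 4.0 ns
max_clock_rate = 1.0 / (min_clock_period * 1e-9)                    # in Hz
print(min_clock_period, max_clock_rate)   # 4.0 ns -> 250 MHz
```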

So really, in reality,

there are going to be all these gray areas where voltages change and

the system basically tries to stabilize into the new logical situation.

And what happens at the end of the clock cycle

is what we view as the real logical state of the system.

And then, we can simply ignore these inconsistencies:

whatever happened during the gray area, we don't need to

worry about it, because we know it's gone by the end of the clock cycle.

So, that's the way we sweep the issue of delays under the rug.

And, that really justifies the way that we can think about time in integer steps,

one after another.

So now, we've reached the situation that we have integer time units and

we know that something can happen at every different time unit.

Under this perspective, all the logic that we talked about in lectures one and

two is what's called combinatorial.

The output at time t depends completely on the input at time t.

There was no information moving from time t minus 1 to time t.

What we mean when we say sequential logic is

that the output at time t

depends on the input at time t minus 1.

We remember things from the last period, and

based on the previous input, at the previous time step,

we actually compute our output at the new time step.

So, if we look at the diagram of time units one, two, three, four,

the input is a, b, c, d, whatever input;

it can be a single bit, it can be a multi-bit bus.

In combinatorial logic, at each time point,

a was converted to some function of a.

So, the output at time two was the function of the input at time two.

And at each time unit we could have a different input and a different output.

When we have sequential logic, the output at time t

depends on the input at time t minus one, and on what happened previously.

In principle, we could also have it depend on what is happening now, at time t itself,

but it's probably best to

split the two types of computations we do.

The combinatorial logic, which happens instantaneously within a time unit,

and the sequential logic,

where we usually don't want to mix new information with old information.

So, we just think of it as:

the output at time t plus 1 depends on the input at the previous time, time t.
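The contrast between the two views can be sketched with an arbitrary assumed function f (the function and input values below are made up for illustration): in the combinatorial case the output at time t is f of the input at time t, while in the sequential case it is f of the input at time t minus 1.

```python
def f(x):
    return x + 1              # arbitrary assumed function, for illustration

inputs = [10, 20, 30, 40]     # input at time units 1, 2, 3, 4 (assumed)

# Combinatorial: output at time t is a function of the input at time t.
comb = [f(x) for x in inputs]

# Sequential: output at time t is a function of the input at time t - 1;
# at time 1 there is no previous input yet, so we mark it as None.
seq = [None] + [f(x) for x in inputs[:-1]]

print(comb)   # [11, 21, 31, 41]
print(seq)    # [None, 11, 21, 31]
```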

8:24

Once we have this point of view, we can do something really interesting.

We can actually have the input and the output be the same bits,

the same buses, the same locations in our hardware.

And we can now call this the state.

As long as our value at time t depends on the previous value at

time t minus 1, we can have these values live on the same wires in the circuit.

So now, we can have a single bit, for example, that holds what's called the state.

And if at time one, it was some kind of input a,

at time two, it will be some kind of function of a, f of a.

At time 3, it will be some kind of function f of f of a,

basically f of the previous value that we have, and so on.

This now lets us, for the first time, actually change state as we go along,

always remembering and always building on previous results.
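This feedback idea can be sketched in a few lines, with an assumed update function f (doubling, purely for illustration, as is the initial value): the same variable holds the state, and each time step replaces it with f of its previous value, giving a, f(a), f(f(a)), and so on.

```python
def f(state):
    return state * 2     # arbitrary assumed update function

state = 1                # the value 'a' at time 1 (assumed)
history = [state]
for _ in range(3):       # times 2, 3, 4: f(a), f(f(a)), f(f(f(a)))
    state = f(state)     # the state lives on the same "wires" every step
    history.append(state)
print(history)           # [1, 2, 4, 8]
```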

So, this ends the basic

abstract way we're thinking about time in digital circuits.

We actually break it into integer time steps and

we look at what happens sequentially,

one time step after another, rather than in continuous time.

And, what we're going to do in the next unit is actually describe the chips

that allow us to do this kind of manipulation.