
Mutable Tables (3.3.3) and Circuit Simulation (3.3.4)

We had a great session on Friday. We discussed some alternative implementations of mutable lists and covered how we would implement a binary search tree in Scheme. Following that we dove into section 3.3.4 and circuit simulation. We spent a lot of time discussing where this event-driven model comes from, since the text seems to assume you already know it. The event-driven model arises as an optimization for simulating digital circuits: in a real circuit the clock signal determines the flow of information through the chip, and within each clock cycle a signal travels across every wire, but often the signal being sent is the same as in the last clock cycle, so no change arises in the state of the system. State changes only occur when one of the inputs changes, so all we really need to do is track when changes occur, which leads to the event-driven model. Below I've pasted in the notes I made while reading; in particular, the digital circuit models lead to interesting and non-intuitive behaviour.

Until next time,

Scott

SICP notes, section 3.3. Exercise 3.24 asks us to match numerical keys within a tolerance, but the keys aren't sorted or anything, so you get back the first matching record, which may not be unique. That makes the table behave a bit strangely: e.g. inserting keys 1, 2, 3 and then retrieving 2 with tolerance 1 gives the record for 3, but insert in the opposite order and you get the record for 1.
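
To make the order-dependence concrete, here is a rough sketch of a tolerance lookup over a plain association list. The `same-key?` test stands in for the predicate argument that exercise 3.24 asks for; the book's `insert!` conses new records onto the front of the table, which is what makes insertion order matter.

```scheme
(define tolerance 1)

(define (same-key? k1 k2)
  (<= (abs (- k1 k2)) tolerance))

;; Return the first record whose key matches within the tolerance.
(define (lookup key records)
  (cond ((null? records) #f)
        ((same-key? key (caar records)) (car records))
        (else (lookup key (cdr records)))))

;; Inserting keys 1, 2, 3 leaves the records in the order 3, 2, 1,
;; so looking up 2 finds the record for 3 first:
(lookup 2 '((3 . c) (2 . b) (1 . a)))   ; => (3 . c)
;; Insert in the opposite order and the same lookup finds 1 instead:
(lookup 2 '((1 . a) (2 . b) (3 . c)))   ; => (1 . a)
```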

Exercise 3.26 is to implement the one-dimensional table as a binary tree. I hope someone does this exercise, but I don't have time; a bare sketch of the idea is below.
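
For what it's worth, here is roughly the record organization I think the exercise has in mind: each node holds a (key . value) entry plus left and right subtrees, and lookup walks the tree by comparing keys. This is only a sketch, not a worked solution (insert! and any rebalancing are left out).

```scheme
;; Node layout: (entry left right), where entry is a (key . value) pair.
(define (make-node entry left right) (list entry left right))
(define (entry node)        (car node))
(define (left-branch node)  (cadr node))
(define (right-branch node) (caddr node))

;; Walk the tree, going left for smaller keys and right for larger ones.
(define (tree-lookup key node)
  (cond ((null? node) #f)
        ((= key (car (entry node))) (cdr (entry node)))
        ((< key (car (entry node))) (tree-lookup key (left-branch node)))
        (else                       (tree-lookup key (right-branch node)))))

;; Example: a small tree with keys 1, 2, 3.
(define t (make-node '(2 . b)
                     (make-node '(1 . a) '() '())
                     (make-node '(3 . c) '() '())))
(tree-lookup 3 t)   ; => c
```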

Exercise 3.27 is the memoization approach to computing Fibonacci numbers. It definitely won't work if you memoize the original fib: the memoization table lives in the environment created inside memo-fib, and the original fib's recursive calls never consult that table. Since you always look up previously computed results, you don't recompute them every time: in the expanding tree of calls, every subtree with a matching argument only gets computed once instead of many times. It does mean you need efficient table lookups, but since the keys are integers you could just index into an array.
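
Here is a sketch of that setup, along the lines of the book's `memoize`, but with a simplified association-list table bundled in so it runs on its own (the book uses the one-dimensional table from 3.3.3 instead).

```scheme
;; Simplified one-dimensional table: new records just go on the front.
(define (make-table) (list '*table*))

(define (lookup key table)
  (let ((record (assoc key (cdr table))))
    (if record (cdr record) #f)))

(define (insert! key value table)
  (set-cdr! table (cons (cons key value) (cdr table)))
  'ok)

;; Wrap f so results are cached in a table local to the wrapper.
(define (memoize f)
  (let ((table (make-table)))
    (lambda (x)
      (or (lookup x table)
          (let ((result (f x)))
            (insert! x result table)
            result)))))

;; The recursive calls go through memo-fib itself, so each subproblem
;; is computed at most once.
(define memo-fib
  (memoize
   (lambda (n)
     (cond ((= n 0) 0)
           ((= n 1) 1)
           (else (+ (memo-fib (- n 1))
                    (memo-fib (- n 2))))))))

(memo-fib 30)   ; => 832040, in a number of steps proportional to n
```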

Section 3.3.4: this is not a toy problem; at the time this may have been state of the art. Event-driven analysis is still a fundamental part of designing digital circuits today.

As an example, suppose there is an or-gate with a delay of 5, at time 0 both inputs are off, and at time 1 one input is turned on. If we always sent all signals at every time step, then at time 0 we send 0 and 0 into the or-gate and get 0 out; at time 1 we send 1 and 0 in and still get 0 out (the delay hasn't elapsed); at time 2, 1 and 0 in, 0 out; and so on until the output finally flips at time 6. So we send a lot of repeated messages, and we send identical messages while waiting on a delay. The event-driven model reduces this workload by identifying when changes occur and only propagating those changes.

Parallelizing an event-driven model like this is very difficult. The reason is that you don't know a priori whether processing the current event will create a new event that falls before the next one already on the event stack, and that new event may influence the result of the following one.

```
A (0) ----------------------------------+
                                        +-- And (delay=3) -- D (0)
B (0) -- Inverter (delay=1) -- C (1) ---+
```

The sketch is supposed to show two inputs, A and B. B goes into an inverter whose output is C; A and C are the inputs to the and-gate, and D is the output. If you flip A and B at time 1, then both changes go on the event stack at time 1. The simulation doesn't treat this as a dag evaluation: the intuitive interpretation is that the and-gate should wait to trigger until all of its inputs are done changing, but we don't have any way to know that in this model. Instead we just capture each change as a new independent event. Below I examine the state as these events are applied, using the notation that an event triple is (time, node, value) and the state is a quadruple (A, B, C, D).

```
Events: (1, A, 1), (1, B, 1)
State:  (A -> 0, B -> 0, C -> 1, D -> 0)
```

We process the first event and A turns to 1. This triggers the and-gate, which updates based on the current A and C values (both 1), so we insert an event to change D to 1 at time 4 (1 plus delay 3). We still have events at time 1, so we don't tick the clock forward yet.

```
Time 1:
  Events: (1, B, 1), (4, D, 1)
  State:  (1, 0, 1, 0)
```

We process the next event and change B to 1. This triggers the inverter, and we schedule a new event for C to change to 0 at time 2, since the inverter delay is 1. That is now the earliest remaining event, so we can tick forward to time 2.

```
Time 2:
  Events: (2, C, 0), (4, D, 1)
  State:  (1, 1, 1, 0)
```

Applying the next event, we change C to 0 and trigger the and-gate to update; it sees A as 1 but C is now 0, so we schedule D to change to 0 at time 5 (2 plus delay 3).

```
Time 4:
  Events: (4, D, 1), (5, D, 0)
  State:  (1, 1, 0, 0)
```

We apply the next event and update D to 1:

```
Time 4:
  Events: (5, D, 0)
  State:  (1, 1, 0, 1)
```

Now we apply the last one and the global clock ticks forward again:

```
Time 5:
  Events:
  State:  (1, 1, 0, 0)
```

This shows how, in the process of arriving at the final state, there can be unexpected intermediate states, i.e. D = 1 for only one time unit. It also shows why it's difficult to do the event processing in parallel: for instance, in the second time-1 state you can't process the turn-D-to-1 event and tick the clock forward to time 4 until after the turn-B-to-1 event is processed. It doesn't really make sense conceptually to do so. On the other hand, you can process events scheduled at the same time in parallel.
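
For reference, here is a minimal sketch of the event queue (the book calls it an agenda) that the trace above is walking: events are (time . action) pairs kept sorted by time, and we always pop and run the earliest one. The names here (make-agenda, schedule!, run!) and the thunk-per-event representation are my own simplification, not the book's interface.

```scheme
(define (make-agenda) (list '*agenda*))

;; Insert (time . action) into the agenda, keeping it sorted by time.
(define (schedule! agenda time action)
  (define (insert events)
    (cond ((null? events) (list (cons time action)))
          ((< time (caar events)) (cons (cons time action) events))
          (else (cons (car events) (insert (cdr events))))))
  (set-cdr! agenda (insert (cdr agenda))))

;; Repeatedly pop the earliest event and run its action.
(define (run! agenda)
  (if (null? (cdr agenda))
      'done
      (let ((event (cadr agenda)))
        (set-cdr! agenda (cddr agenda))
        ((cdr event))
        (run! agenda))))

;; The tail end of the trace above: D -> 1 at time 4, D -> 0 at time 5.
(define agenda (make-agenda))
(schedule! agenda 5 (lambda () (display "time 5: D -> 0") (newline)))
(schedule! agenda 4 (lambda () (display "time 4: D -> 1") (newline)))
(run! agenda)   ; runs the time-4 action before the time-5 one
```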

Page 505: now our digital circuit representation takes on the role of a language. Referring back to section 1.1 is instructive, but also somehow confusing. It means that with a few routines in our current language we have created another language, and it makes you think back: what were we tacitly accepting as givens at the outset? If I were taking ECE 115 we would learn this higher-level language first; similarly we simply accepted that we have functions, numbers, symbols, and types. For programmers the syntax of those constructs forms the language, but where does it start?

Page 512, footnote 27: the distinction between procedures and data is syntactic sugar, adjust as you please.
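
In the spirit of that footnote, here is a tiny example (my own, not the book's) of a "data object" that is really just a procedure with local state dispatching on messages, which is how the section's wires are represented.

```scheme
;; A counter "object": the data lives in the closure, and the returned
;; dispatch procedure plays the role of the data structure itself.
(define (make-counter)
  (let ((count 0))
    (define (dispatch m)
      (cond ((eq? m 'increment!) (set! count (+ count 1)) count)
            ((eq? m 'value) count)
            (else (error "Unknown operation -- COUNTER" m))))
    dispatch))

(define c (make-counter))
(c 'increment!)   ; => 1
(c 'increment!)   ; => 2
(c 'value)        ; => 2
```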


-------- dann notes -------

Great notes, Scott! I figured I'd tack on a list of a few off-topic topics we discussed: Dwarf Fortress, GreenArrays GA144, the time complexity of binary key/value tables built using cons cells, the difficulty of building a constant-time random access array using cons cells, how hash tables work, rebalancing of trees and hashes, vacuum tube computers, ternary computers, decimal computers, building logic gates from transistors, transistor saturation, combinatorial explosion for hardware testing, parallelizing hardware testing, timing concerns in hardware and dataflow modeling, NAND/XOR basis for circuits, SKI combinator basis for programming, whitespace language, Idris, dependent types.