Notes - Pages 13-25

SIMD vs. MIMD
Vector (SIMD) computers need only indexed memory and operations. MIMD requires new primitives such as fork/join for processes and a distinction between shared and private memory.
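
Not from the text, just a rough Python sketch of the fork/join primitive with shared vs. private memory (using multiprocessing; the worker function and the 4-way partitioning are made up):

    from multiprocessing import Process, Value

    def worker(shared_total, chunk):
        local_sum = sum(chunk)                 # private: local to this process
        with shared_total.get_lock():          # shared: visible to every process
            shared_total.value += local_sum

    if __name__ == "__main__":
        data = list(range(16))
        shared_total = Value("i", 0)                           # shared memory
        chunks = [data[i::4] for i in range(4)]                # private work per process
        procs = [Process(target=worker, args=(shared_total, c)) for c in chunks]
        for p in procs:
            p.start()                                          # fork
        for p in procs:
            p.join()                                           # join
        print(shared_total.value)                              # 0+1+...+15 = 120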
14-15
Statement mutual independence
The set of all variables a statement S reads is its input set I(S); the set of all variables it writes is its output set O(S). Two statements S_i and S_j are mutually independent when O(S_i) ∩ O(S_j) = ∅ (output independence), I(S_j) ∩ O(S_i) = ∅ (flow independence), and I(S_i) ∩ O(S_j) = ∅ (anti independence). Together these three properties are known as Bernstein's conditions.
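
A quick sketch of my own, using Python sets, of checking Bernstein's conditions for two statements; the function name and the set encoding are illustrative, not from the book:

    def bernstein_independent(in_i, out_i, in_j, out_j):
        # O(S_i) ∩ O(S_j) = ∅   (output independence)
        # I(S_j) ∩ O(S_i) = ∅   (flow independence)
        # I(S_i) ∩ O(S_j) = ∅   (anti independence)
        return not (out_i & out_j) and not (in_j & out_i) and not (in_i & out_j)

    # S_i: a = b + c    S_j: d = e * f   -> independent
    print(bernstein_independent({"b", "c"}, {"a"}, {"e", "f"}, {"d"}))   # True
    # S_i: a = b + c    S_j: d = a * 2   -> flow dependent on 'a'
    print(bernstein_independent({"b", "c"}, {"a"}, {"a"}, {"d"}))        # False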
17
Data dependency graphs
A way to characterize the parallelism of an algorithm and to see how to reorganize the computation to improve that parallelization. Example: going from a linear chain of additions to a binary tree of additions keeps the same number of operations, but the chain has depth n-1 while the tree has depth about log2(n), so the binary-tree form maps better onto SIMD/MIMD architectures.
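
Sketch (mine) contrasting the two forms: both perform n-1 additions, but in the chain every addition depends on the previous result, while in the tree the additions within each level are mutually independent:

    def linear_sum(xs):
        total = xs[0]
        for x in xs[1:]:          # each addition depends on the previous result
            total = total + x
        return total              # depth n-1

    def tree_sum(xs):
        while len(xs) > 1:        # each level: independent pairwise additions
            pairs = [xs[i] + xs[i + 1] for i in range(0, len(xs) - 1, 2)]
            if len(xs) % 2:       # carry an odd leftover element up a level
                pairs.append(xs[-1])
            xs = pairs
        return xs[0]              # depth ~ ceil(log2 n)

    print(linear_sum(list(range(8))), tree_sum(list(range(8))))   # 28 28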
24-25

Statement independence is commutative but not transitive (18), so establishing mutual independence of N statements is an O(N^2) pairwise check (see the sketch below). The section covered architecture graphs, pseudocode definitions, primitives, and the conditions that make parallelizing statements possible, where a statement can be any unit of computation...
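
Sketch (mine) of that O(N^2) pairwise check, representing each statement by its (I, O) sets and applying Bernstein's conditions to every pair:

    def independence_pairs(stmts):
        # stmts: list of (I, O) set pairs, one per statement (any unit of computation)
        pairs = []
        for i in range(len(stmts)):
            for j in range(i + 1, len(stmts)):        # every pair: O(N^2)
                (in_i, out_i), (in_j, out_j) = stmts[i], stmts[j]
                if (not (out_i & out_j)               # Bernstein's three conditions
                        and not (in_j & out_i)
                        and not (in_i & out_j)):
                    pairs.append((i, j))
        return pairs

    # S0: a = b + c    S1: d = a * 2    S2: e = f - g
    stmts = [({"b", "c"}, {"a"}), ({"a"}, {"d"}), ({"f", "g"}, {"e"})]
    print(independence_pairs(stmts))   # [(0, 2), (1, 2)] -- S0 and S1 conflict on 'a'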