Computing with Electronic Nanotechnologies
John E Savage, Brown University
@John E Savage, 2003
The End of Photolithography
- The 2001 ITRS (Roadmap) predicts that within 10-15 years "most known technological capabilities will approach or have reached their limits."
- Nanotechnology will replace photolithography.
- Nanotechnology manufacturing will involve stochastic self-assembly: chemistry will replace physics.
Development of Nanotechnologies
- Carbon nanotubes and semiconducting nanowires (NWs) have been demonstrated.
- The crossbar is a promising technology.
Construction and Data Storage in Nanoarrays
- Crossbars will be constructed using directed self-assembly.
- Technologies developed for data storage at crossbar junctions:
  - Switchable electronic molecular layers.
  - Mechanical contact between NWs; Nantero claims 10^10 bits per wafer with suspended carbon nanotubes.
Challenges of Nanoarrays
- Controlling many nanowires with a small number of microwires is a necessity.
- Large capacities demand efficient programming.
h-hot Addressing of Nanowires with b Microwires
- Each NW has h control regions; a NW conducts only when the address wires associated with all h of its regions are "hot" (active).
[Figure: NWs crossing the address microwires]
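The conduction rule can be sketched in a few lines of Python (the function names `h_hot_codes` and `conducts` are mine, for illustration): an h-hot code over b microwires is just an h-subset of the wires, and a NW conducts exactly when every wire in its code is active.

```python
from itertools import combinations

def h_hot_codes(b, h):
    """All h-hot addresses over b microwires: each code is the set of
    h address wires whose control regions lie on the nanowire."""
    return [frozenset(c) for c in combinations(range(b), h)]

def conducts(code, active_wires):
    """A nanowire conducts exactly when every wire in its code is hot."""
    return code <= set(active_wires)
```

With h about b/2 there are C(b, b/2) codes, so a few microwires can address many NWs; for example, `h_hot_codes(4, 2)` yields 6 codes.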
Manufacture of Modulation-Doped Nanowires
- Semiconducting NWs are grown from seed catalysts; a NW's diameter is controlled by its seed.
- Changing the gaseous mixture as a NW grows allows p-type regions to be doped axially.
[Figure: NW grows here]
Self-Assembly of Modulation-Doped Nanowires
- A liquid containing many copies of each h-hot address is poured into a trough on a chip.
- NWs self-assemble into parallel locations.
- NW codes are chosen at random.
Stochastic Assembly of Nanoscale Interfaces
- Construct a decoder by placing NWs above an orthogonal set of address microwires.
- Issues:
  - Under what conditions will randomly chosen NWs have different codes?
  - How can the chosen codes be discovered?
  - How can misalignment between control regions and address wires be handled?
  - How can an incomplete set of codes in the trough be coped with?
Stochastic Assembly of Nanoscale Interfaces (cont.)
- With high probability the C NWs in the trough have distinct codes if the starting number of distinct codes exceeds 100C.
- Discover absent codes by testing whether 1s can be stored.
- Coping with alignment errors:
  - When offsets are large: repeat the code along the length of the NW.
  - When offsets are small: make the length of a control region less than the separation between microwires.
- Translate the complete set of external codes into the incomplete internal set with a micro-level circuit.
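The distinct-codes question is a birthday-problem calculation. The following Python sketch (function names are mine) computes the probability that C NWs drawn uniformly from a pool of distinct codes all differ, both exactly and by simulation:

```python
import random

def prob_all_distinct_exact(num_codes: int, c: int) -> float:
    """Exact birthday-style probability that c nanowires, each carrying
    a code drawn uniformly from num_codes distinct codes, are all
    distinct: prod over i < c of (1 - i / num_codes)."""
    p = 1.0
    for i in range(c):
        p *= 1.0 - i / num_codes
    return p

def prob_all_distinct_sim(num_codes: int, c: int, trials: int = 2000) -> float:
    """Monte Carlo estimate of the same probability."""
    ok = sum(
        len({random.randrange(num_codes) for _ in range(c)}) == c
        for _ in range(trials)
    )
    return ok / trials
```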
Array Programming
- Goal: given an array of 1s and 0s, program the crossbar in a small number of steps.
- One programming step writes 1s (stores) or 0s (restores) into a subarray.
- Questions:
  - How difficult is it to find optimal programs?
  - What are the most efficient ways of entering data?
  - Do restores help?
Array Programming Decision Problem
array programming (ap)
- Instance: (W, k), where W is an n × n array over {0,1} and k is an integer.
- Answer: "Yes" if there exists a set of at most k stores and restores that covers W.
- Goal: Determine whether (W, k) is a "Yes" instance when h-hot addressing is used.
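As a concrete reading of the model, here is a small Python check (the helper `applies_to` is mine, and it assumes the array starts as all 0s) that a sequence of stores and restores produces W:

```python
def applies_to(W, steps):
    """Check the programming model: starting from the all-0s array
    (an assumption of this sketch), apply each step (rows, cols, v) in
    order -- a store writes v = 1, a restore writes v = 0, over the
    subarray rows x cols -- and test whether the result equals W."""
    n, m = len(W), len(W[0])
    A = [[0] * m for _ in range(n)]
    for rows, cols, v in steps:
        for i in rows:
            for j in cols:
                A[i][j] = v
    return A == W
```

For example, a store on the whole 2 × 2 array followed by a restore of the single cell (1, 1) yields the array [[1,1],[1,0]] in two steps.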
Approach to Study of h-hot Array Programming
- Start with the 1-hot case; extend to h-hot.
- Show most ap problems are NP-hard with stores and restores.
- Show ap is hard to approximate in polynomial time with stores alone unless P = NP.
- Explore conditions for good approximations.
- Develop algorithms for structured arrays; this helps in understanding the model.
- Restores are powerful: they allow many fewer steps.
Approach to Study of h-hot Array Programming (cont.)
- Row programming is an important alternative (still hard).
- When is array programming approximable?
  - Stores and restores are done on a bounded number of rows and columns (a technology limit).
  - Each row and column has a bounded number of 1s (a restriction on the problems considered).
  - h is about b/2 (a design specification).
Most Array Programming Problems are Hard
Counting arguments show most n × n arrays require many steps:
- at least (1 − ε) n/2 when h = 1
- Ω(n^1.5) when h = 2
- Ω(n^(5/3)) when h = 3
- Ω(n² / log₂ n) when h = O(log₂ n)
Compare these when n = 10^10 or 10^11.
Many structured problems are not hard, and the NP-hard instances may not be among the worst cases.
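To see the scale of these bounds at n = 10^10, a quick back-of-the-envelope computation (illustrative only):

```python
import math

# Illustrative only: the step lower bounds quoted above, evaluated at
# n = 10**10 to show how quickly they grow with h.
n = 10**10
lower_bounds = {
    "h = 1": n / 2,                        # (1 - eps) n / 2, eps small
    "h = 2": n ** 1.5,                     # Omega(n^1.5)
    "h = 3": n ** (5 / 3),                 # Omega(n^(5/3))
    "h = O(log n)": n ** 2 / math.log2(n), # Omega(n^2 / log2 n)
}
for h, steps in lower_bounds.items():
    print(f"{h:>12}: about {steps:.2e} steps")
```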
1-hot Array Programming is NP-hard Under Stores Alone
The reduction is from set basis to ap.
set basis
- Instance: Triples (U, S, k) where U = {e1, …, em}, S = {S1, …, Sn}, each Si ⊆ U with |Si| ≤ b, and k an integer.
- Answer: "Yes" if (U, S, k) has a basis of size l ≤ k, that is, a collection B = {B1, …, Bl}, Bi ⊆ U, such that each Si is the union of some sets in B.
Theorem (Stockmeyer '75): set basis is NP-complete.
Equivalence of set basis and array programming
[Figure: an instance of set basis alongside the corresponding instance of array programming]
1-hot Array Programming is NP-hard Under Stores (cont.)
- From (U, S, k), create an n × m array D with a column for every element of U and a row for every set in S.
- Each Bi identifies a subarray: its columns together with the rows j for which Bi ⊆ Sj.
- Since each Sj is represented as a union of basis sets, the 1s in D are covered by stores on the subarrays defined by the Bi's.
- (U, S, k) is a "Yes" instance of set basis if and only if (D, k) is a "Yes" instance of ap. Q.E.D.
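The reduction can be sketched in Python (the function names `set_basis_to_ap` and `stores_cover` are mine): build D from (U, S), then check that stores on the subarrays defined by a candidate basis cover exactly the 1s of D.

```python
def set_basis_to_ap(universe, sets):
    """Build the array D of the reduction: one row per set in S, one
    column per element of U; D[i][j] = 1 iff element j is in S_i."""
    col = {e: j for j, e in enumerate(sorted(universe))}
    D = [[0] * len(col) for _ in sets]
    for i, S in enumerate(sets):
        for e in S:
            D[i][col[e]] = 1
    return D

def stores_cover(universe, sets, D, basis):
    """Check that the stores defined by the basis cover D exactly:
    each basis set B writes 1s in B's columns, on every row whose set
    contains B as a subset."""
    col = {e: j for j, e in enumerate(sorted(universe))}
    covered = [[0] * len(col) for _ in sets]
    for B in basis:
        for i, S in enumerate(sets):
            if B <= S:  # B is a subset of S_i
                for e in B:
                    covered[i][col[e]] = 1
    return covered == D
```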
Inapproximability of 1-hot Array Programming Under Stores
- array programming is also equivalent to covering by complete bipartite subgraphs, which is ratio-reducible to clique partition.
- Since clique partition is hard to approximate, the same is true of array programming.
1-Hot Array Programming is NP-hard Under Stores & Restores
ap-log: the instances of ap in which k = O(log n).
a) Show ap-log is NP-complete.
b) Reduce ap-log with stores to ap with stores and restores: given an instance of ap-log under stores, construct an instance of ap under stores and restores with don't cares.
c) Find an m × m array whose optimal algorithm under stores and restores uses log m stores.
Programming Special Structured Arrays under 1-hot
We show that some n × n arrays found in images or used in crossbars have efficient programs.

Array                                 Time
Diagonal                              2 log₂ n
Lower full                            2 log₂ n
Banded of bandwidth β                 O(β log n)
s-sparse (≤ s 1s per row/column)      O(s² log₂ n)

Theorem: Lower bounds show the first three upper bounds are tight.
Recursive Program for Diagonal Array
[Figure: recursive construction shown on an 8 × 8 diagonal array]
T(2) = 2
T(2^k) = T(2^(k-1)) + 2
so T(2^k) = 2k
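A sketch of the recursive scheme in Python, assuming the reading of the snapshot slides that two stores fix agreement on the low-order bit and two restores per remaining bit clear the off-diagonal 1s (the function name `program_diagonal` is mine):

```python
def program_diagonal(k):
    """Program the n x n diagonal array, n = 2**k, in 2k steps.
    A store writes 1 on a subarray rows x cols; a restore writes 0."""
    n = 1 << k
    W = [[0] * n for _ in range(n)]

    def bit(x, b):
        return (x >> b) & 1

    def apply(rows, cols, value):
        for i in rows:
            for j in cols:
                W[i][j] = value

    steps = 0
    # Two stores: write 1 wherever row and column agree on bit 0.
    for parity in (0, 1):
        apply([i for i in range(n) if bit(i, 0) == parity],
              [j for j in range(n) if bit(j, 0) == parity], 1)
        steps += 1
    # Two restores per higher bit: clear cells where row and column
    # disagree on bit b; what survives is exactly the diagonal.
    for b in range(1, k):
        for rv, cv in ((0, 1), (1, 0)):
            apply([i for i in range(n) if bit(i, b) == rv],
                  [j for j in range(n) if bit(j, b) == cv], 0)
            steps += 1
    return W, steps
```

For k = 3 this uses 2k = 6 steps, matching the recurrence T(2^k) = 2k.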
Complete Program for Diagonal Array
[Figures, slides 23-29: snapshots of an 8 × 8 array after each programming step, beginning with the all-0s array and ending with the diagonal array.]
Summary of 1-hot Results
- NP-hard to both solve and approximate ap with stores alone.
- NP-hard to solve ap with stores and restores.
- Open whether ap can be approximated in polynomial time when both stores and restores are used.
- Prototypical structured arrays can be programmed much more efficiently under stores and restores than under stores alone.
h-hot Array Programming
Theorem: h-hot array programming under stores and restores is NP-hard.
Proof: Embed an instance of the 1-hot problem in the h-hot problem.
Theorem: (log n)-hot programming of n × n arrays is log-approximable.
Proof: O(log n) address wires can activate n NWs. Represent the problem as an instance of set cover; the cleverness is in the representation of the universe and the sets.
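The log-approximation comes from the classic greedy algorithm for set cover; a minimal sketch (the set-cover encoding of the programming problem itself is omitted here):

```python
def greedy_set_cover(universe, sets):
    """Greedy set cover: repeatedly pick the set covering the most
    uncovered elements. Achieves an O(log |U|) approximation ratio,
    which is what makes the set-cover representation of (log n)-hot
    programming log-approximable."""
    uncovered = set(universe)
    cover = []
    while uncovered:
        best = max(sets, key=lambda s: len(s & uncovered))
        if not (best & uncovered):
            raise ValueError("sets do not cover the universe")
        cover.append(best)
        uncovered -= best
    return cover
```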
h-Hot Row Programming Under Stores Alone
h-hot row programming (h-hrp)
- Instance: (S, h, b, k), where the integers in S are addressed by an h-hot scheme on b addresses and k is an integer.
- Answer: "Yes" if k stores cover all entries of S.
Theorem: h-hrp is not approximable under stores alone in polynomial time unless P = NP.
Proof: Implied by a bounded-ratio reduction between clique cover and h-hrp.
h-Hot Row Programming Under Stores and Restores
Theorem: There exists a bounded-ratio reduction from 1-hot array programming to 2-hot row programming.
Proof: Represent each integer (NW) in S by a pair of integers and embed the 1-hot ap instance in the upper right quadrant.
[Figure: the embedding]
h-Hot Row Programming Under Stores and Restores (cont.)
Theorem: h-hot row programming is NP-complete under stores and restores.
Proof: Follows from the above reduction.
It is very important to know whether good approximations are possible under stores and restores.
Programming Special Cases Under h-Hot Addressing
Use the nested blocks model: divide the b address wires into h sets of b/h wires; the ith component of an address h-tuple is drawn from the ith set.

Array                     Time
Diagonal                  2 log n + 2h
Lower full                2·3^h (log n)/h
Banded, bandwidth β       O(3^h (1 + β/n^(1-1/h)) (log n)/h)
s-sparse                  no results
Heuristic for Array Programming under Stores and Restores
- Goal: find large subarrays of 1s (or 0s), ignoring don't cares.
- Reverse the sequence of subarrays found to program W.
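One way to read the heuristic in code (a sketch; `find_subarray` is a caller-supplied helper, and peeled entries become don't cares that earlier-applied steps may freely overwrite):

```python
def program_by_peeling(W, find_subarray):
    """Heuristic sketch: repeatedly peel off a subarray whose defined
    entries all share one value v, replacing its entries by don't cares
    (None), and record (rows, cols, v). Applying the recorded steps in
    REVERSE order programs W: a step peeled later is applied earlier,
    and its writes into don't-care cells are overwritten afterwards."""
    work = [row[:] for row in W]
    steps = []
    while any(x is not None for row in work for x in row):
        rows, cols, v = find_subarray(work)
        for i in rows:
            for j in cols:
                work[i][j] = None
        steps.append((rows, cols, v))
    return list(reversed(steps))
```

For illustration, `find_subarray` could be as crude as returning a single remaining cell; better heuristics, such as the max-clique approach on the next slide, find large subarrays.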
Finding a Large Subarray of 1s in a W having 1s, 0s and Don't Cares
- Under 1-hot: given W, construct G = (V, E) so that there is a one-to-one correspondence between cliques in G and subarrays in W.
- Let V = {v_ij | w_ij = 1}; E has an edge between v_ij and v_kl when w_ij and w_kl can be in the same subarray.
- Use a max-clique heuristic with edge weights 0 and 1 (0s correspond to don't cares) to find large subarrays.
- Under h-hot: put an edge in E when the two vertices can lie in a subarray addressable with h-hot addressing.
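A sketch of the 1-hot graph construction in Python (the function name `subarray_graph` is mine): two 1-cells can share an all-1s subarray exactly when the opposite corners of their rectangle are 1s or don't cares, and pairwise compatibility then makes every clique correspond to a valid subarray.

```python
def subarray_graph(W):
    """1-hot construction: vertices are the positions of 1s in W; there
    is an edge between (i, j) and (k, l) iff the opposite corners
    (i, l) and (k, j) are 1 or don't care (None), so all four cells fit
    in one all-1s subarray. Cliques then correspond to subarrays."""
    def ok(x):
        return x == 1 or x is None
    V = [(i, j) for i, row in enumerate(W) for j, x in enumerate(row) if x == 1]
    E = set()
    for a in range(len(V)):
        for b in range(a + 1, len(V)):
            (i, j), (k, l) = V[a], V[b]
            if ok(W[i][l]) and ok(W[k][j]):
                E.add((V[a], V[b]))
    return V, E
```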
Conclusions
- The expected size of nanoarrays is both an opportunity and a challenge: efficient programs will be needed, and small constant-factor improvements are important.
- While array programming is NP-hard and hard to approximate using only stores, many structured problems are easy.
- Good heuristics, perhaps based on max clique, may be helpful.
Open Problems
- Find good polynomial-time approximation schemes for h-hot array programming under stores and restores, or show this is not possible unless P = NP.
- Writing 1s may take 10^3 times the time to write 0s (DeHon). How does this change the problem?
- Analyze the performance of heuristics of the type shown above for programming arrays with stores and restores.
Credits
The research on stochastic assembly of address decoders is joint with André DeHon of Caltech and Patrick Lincoln of SRI.
The research on array programming is joint with Lee-Ad Gottlieb and Arkady Yerukhimovich.
Subarrays Correspond to Cliques in Graph
[Figure: a 0-1 array and the graph whose cliques correspond to its subarrays]