Execute automated computations with `populate()`.

```python
# Populate all missing entries
SessionAnalysis.populate()

# With progress display
SessionAnalysis.populate(display_progress=True)
```

## Restrict What to Compute

```python
# Only specific subjects
SessionAnalysis.populate(Subject & "sex = 'M'")

# Only recent sessions
SessionAnalysis.populate(Session & "session_date > '2026-01-01'")

# Specific key
SessionAnalysis.populate({'subject_id': 'M001', 'session_idx': 1})
```

## Limit Number of Jobs

```python
# Process at most 100 entries
SessionAnalysis.populate(limit=100)
```

## Error Handling

```python
# Continue on errors (log but don't stop)
SessionAnalysis.populate(suppress_errors=True)

# Check what failed
failed = SessionAnalysis.jobs & "status = 'error'"
print(failed)

# Clear errors to retry
failed.delete()
SessionAnalysis.populate()
```

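The retry cycle above can be sketched without a database. The following toy, stdlib-only model uses an in-memory list as a stand-in for the jobs table; the dict layout and field names are invented for illustration and are not the real DataJoint schema:

```python
# Toy stand-in for the jobs table (invented layout, for illustration only).
jobs = [
    {"key": {"session_idx": 1}, "status": "error"},
    {"key": {"session_idx": 2}, "status": "reserved"},
    {"key": {"session_idx": 3}, "status": "error"},
]

# Analogue of: failed = SessionAnalysis.jobs & "status = 'error'"
failed = [j for j in jobs if j["status"] == "error"]
print(f"{len(failed)} failed jobs")  # 2 failed jobs

# Analogue of failed.delete(): removing the error rows lets populate()
# pick those keys up again on the next run.
jobs = [j for j in jobs if j["status"] != "error"]
```
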
## When to Use Distributed Mode

Choose your populate strategy based on your workload and infrastructure:

**Example:**
```python
# Simple, direct execution
SessionAnalysis.populate()
```

---

- Prevents duplicate work between workers
- Fault tolerance (crashed jobs can be retried)
- Job status tracking (`SessionAnalysis.jobs`)
- Error isolation (one failure doesn't stop others)

**Example:**
```python
# Distributed mode with job coordination
SessionAnalysis.populate(reserve_jobs=True)
```

**Job reservation overhead:** ~100 ms per job

**Example:**
```python
# Use 4 CPU cores
SessionAnalysis.populate(reserve_jobs=True, processes=4)
```

**Caution:** Don't use more processes than you have CPU cores; the extra workers only add context-switching overhead.
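One defensive pattern, sketched here as a suggestion rather than a documented recipe, is to clamp the worker count to the machine's visible CPU count (`requested` is a hypothetical configuration value):

```python
import os

# Clamp a hypothetical configured worker count to the CPU count,
# so we never oversubscribe the machine.
requested = 16
processes = min(requested, os.cpu_count() or 1)

# SessionAnalysis.populate(reserve_jobs=True, processes=processes)
```
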

For multi-worker coordination:

```python
# Worker 1 (on machine A)
SessionAnalysis.populate(reserve_jobs=True)

# Worker 2 (on machine B)
SessionAnalysis.populate(reserve_jobs=True)

# Workers coordinate automatically via the database:
# each reserves different keys, so no work is duplicated.
```
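How reservation prevents duplicates can be illustrated with a toy, single-process model. Here a Python set plays the role of the database's atomic check-and-set; in reality the workers run on separate machines and the database arbitrates:

```python
# Keys waiting to be computed (toy data).
pending = [{"session_idx": i} for i in range(6)]
reserved = set()  # stands in for the jobs table

def reserve_next():
    """Claim the next unreserved key, or None when all are taken.
    In the real system this check-and-insert is one atomic DB operation."""
    for key in pending:
        if key["session_idx"] not in reserved:
            reserved.add(key["session_idx"])
            return key
    return None

# Two "workers" taking turns: each call hands out a distinct key.
work = {"A": [], "B": []}
worker = "A"
while (key := reserve_next()) is not None:
    work[worker].append(key)
    worker = "B" if worker == "A" else "A"

done = work["A"] + work["B"]  # all 6 keys, each claimed exactly once
```
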

```python
# What's left to compute
remaining = SessionAnalysis.key_source - SessionAnalysis
print(f"{len(remaining)} entries remaining")

# View job status
SessionAnalysis.jobs
```

## The `make()` Method

```python
@schema
class SessionAnalysis(dj.Computed):
    definition = """
    -> Session
    ---
    result : float
    """

    def make(self, key):
        # 1. Fetch input data
        data = (Session & key).fetch1('data')

        # 2. Compute
        result = process(data)

        # 3. Insert
        self.insert1({**key, 'result': result})
```