python reminder
vectorized sum for integrals
assuming f accepts array input and xvec, xweight are NumPy arrays,
inte = np.sum(f(xvec) * xweight)
is much faster than the explicit Python loop
inte = 0
for i1, xval in enumerate(xvec):
    inte += f(xval) * xweight[i1]
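A minimal runnable sketch of the pattern; the integrand and the grid (a midpoint rule on [0, 2]) are only illustrative, not from the notes above:
import numpy as np

def f(x):
    # integrand; works on scalars and on arrays alike
    return np.exp(-x**2)

# midpoint rule on [0, 2]: midpoints as nodes, interval widths as weights
N = 1000
edges = np.linspace(0.0, 2.0, N + 1)
xvec = 0.5 * (edges[:-1] + edges[1:])
xweight = np.diff(edges)

# vectorized: one array expression, summed once
inte = np.sum(f(xvec) * xweight)

# explicit loop: same result, much slower for large N
inte_loop = 0.0
for i1, xval in enumerate(xvec):
    inte_loop += f(xval) * xweight[i1]

print(inte, inte_loop)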
generator for nested sum
cast a nested sum into one big iterator (a runnable sketch follows after the two cases below):
- simple case:
N1 = 5
N2 = 8
p1_arr = np.linspace(0.8, 1.2, N1)
p2_arr = np.linspace(3.8, 4.5, N2)
# all index pairs as a single iterator
grid = (
    (i1, i2)
    for i1 in range(N1) for i2 in range(N2)
)
ans = (
    _f(p1_arr[i1], p2_arr[i2])
    for i1, i2 in grid
)
ans = sum(ans)
- further processing:
N1 = 10
N2 = 80
para1_arr = np.linspace(0.8, 1.2, N1)
para2_arr = np.linspace(3.6, 4.0, N2)
grid = (
    (i1, i2)
    for i1 in range(N1) for i2 in range(N2)
)
# keep each objective value together with its grid indices
pool = [
    [fitfunc([para1_arr[i1], para2_arr[i2]]), i1, i2]
    for i1, i2 in grid
]
# row 0: objective values, rows 1-2: indices (stored as floats, hence int() below)
pool = np.array(pool).T
_ii = np.argmin(pool[0])
i1, i2 = int(pool[1][_ii]), int(pool[2][_ii])
print(para1_arr[i1], para2_arr[i2])
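A self-contained sketch of both cases, with _f and fitfunc replaced by hypothetical stand-ins; itertools.product builds the same index pairs as the double for in the generator expression:
import numpy as np
from itertools import product

def _f(p1, p2):
    # hypothetical summand
    return p1 * p2

def fitfunc(params):
    # hypothetical objective: squared distance from (1.0, 3.8)
    p1, p2 = params
    return (p1 - 1.0)**2 + (p2 - 3.8)**2

N1, N2 = 5, 8
p1_arr = np.linspace(0.8, 1.2, N1)
p2_arr = np.linspace(3.8, 4.5, N2)

# simple case: the nested sum as one flat iterator
total = sum(_f(p1_arr[i1], p2_arr[i2]) for i1, i2 in product(range(N1), range(N2)))

# further processing: brute-force grid search for the minimum
pool = np.array([
    [fitfunc([p1_arr[i1], p2_arr[i2]]), i1, i2]
    for i1, i2 in product(range(N1), range(N2))
]).T
_ii = np.argmin(pool[0])
i1, i2 = int(pool[1][_ii]), int(pool[2][_ii])
print(total, p1_arr[i1], p2_arr[i2])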
passing a tuple as an index for an array
lat[tuple([*xvec, d])]
is equivalent to
lat[xvec[0], xvec[1], xvec[2], xvec[3], d]
but works for xvec of any length; the explicit tuple() matters, because indexing with a plain list triggers fancy indexing instead
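A quick check of the equivalence on an illustrative 5-dimensional array (shape and indices are made up here):
import numpy as np

lat = np.arange(2 * 3 * 4 * 5 * 6).reshape(2, 3, 4, 5, 6)
xvec = [1, 2, 3, 4]
d = 5

# the tuple form keeps working when the length of xvec changes
assert lat[tuple([*xvec, d])] == lat[xvec[0], xvec[1], xvec[2], xvec[3], d]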
data stacking
data = np.vstack((data, new_col))
# appends new_col as an extra row (row-wise layout: one variable per row)
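A sketch under the assumption that data is stored row-wise (one variable per row, as the other snippets in these notes index it); np.vstack then appends the new series as an extra row:
import numpy as np

data = np.array([
    [0.0, 1.0, 2.0],   # x values
    [0.0, 1.0, 4.0],   # y values
])
new_col = np.array([0.0, 1.0, 8.0])   # a further variable on the same grid

data = np.vstack((data, new_col))
print(data.shape)   # (3, 3)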
filename versioning
filename = f'file_{int(num):03d}'
# zero-padded to three digits, so alphabetical and numerical order agree
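A small illustrative loop showing the names this produces:
for num in (0, 1, 2, 10, 100):
    filename = f'file_{int(num):03d}'
    print(filename)
# file_000, file_001, file_002, file_010, file_100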
watch out for side effects
import numpy as np
# test 1
xarr = [1, 2, 3]
yarr = [2, 4, 6]
a = xarr
a += yarr
# Guess what xarr becomes now?
# test 2
xarr = np.array([1, 2, 3])
yarr = np.array([2, 4, 6])
a = xarr
a += yarr
# Guess what xarr becomes now?
This can be a hard bug to track down!
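In both tests a is only another name for xarr, and += mutates the object in place: the list grows to [1, 2, 3, 2, 4, 6] and the array becomes [3, 6, 9]. If an independent object was intended, make the copy (or the addition) explicit, e.g.:
import numpy as np

xarr = np.array([1, 2, 3])
yarr = np.array([2, 4, 6])

# either copy first and modify the copy in place ...
a = xarr.copy()
a += yarr

# ... or build a new array outright; both leave xarr untouched
b = xarr + yarr

print(xarr)   # [1 2 3]
print(a, b)   # [3 6 9] [3 6 9]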
use a dictionary to access and update global observables
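The note above carries no code; a minimal sketch of one way to read it, with illustrative names (obs, measure_step) and made-up observables: keep all run-wide quantities in a single dict, so any function can read and update them by name without global statements.
import numpy as np

# one container for all run-wide observables
obs = {'energy': 0.0, 'magnetization': 0.0, 'n_samples': 0}

def measure_step(config, obs):
    # update the shared observables in place
    obs['energy'] += float(np.sum(config**2))
    obs['magnetization'] += float(np.mean(config))
    obs['n_samples'] += 1

rng = np.random.default_rng(0)
for _ in range(10):
    measure_step(rng.normal(size=16), obs)

print(obs['energy'] / obs['n_samples'], obs['magnetization'] / obs['n_samples'])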
adding NaN to signal discontinuities
# some condition to decide where the break sits
iloc = np.argmax(data[2])
# insert a NaN column there; plotting libraries such as matplotlib split the line at NaN
data = np.insert(data, iloc, np.nan, axis=1)
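A runnable sketch of the pattern, again assuming row-wise data; the "condition" used here (largest jump between neighbouring y values) is only an example:
import numpy as np
import matplotlib.pyplot as plt

# row-wise data: x in row 0, y in row 1, with a step at x = 0.5
x = np.linspace(0.0, 1.0, 11)
y = np.where(x < 0.5, 0.0, 1.0)
data = np.vstack((x, y))

# locate the discontinuity (largest jump between neighbouring y values)
iloc = np.argmax(np.abs(np.diff(data[1]))) + 1

# insert a NaN column there; matplotlib breaks the plotted line at NaN
data = np.insert(data, iloc, np.nan, axis=1)

plt.plot(data[0], data[1])
plt.show()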