Haha, I get the feeling that I have weird use cases....
The following should return spike times in seconds, but, alas, they're saved as ints, and the ticks of the range dimension only use the raw linked data, so the polynomial scaling never gets applied.
import numpy as np
import nixio

np.random.seed(1234)
timestamps = np.sort(np.random.choice(np.arange(1000000, dtype=int), 10000, replace=False))
fs = 40000
f = nixio.File.open("test.nix", "w")
b = f.create_block("block0", "foo")
da_ts = b.create_data_array("spike_times", "foo", dtype=np.uint64, data=timestamps)
da_ts.polynom_coefficients = (0, 1 / fs)
da_ts.unit = "s"
da_ts.label = "time"
dim = da_ts.append_range_dimension()
dim.link_data_array(da_ts, [-1])
pos = b.create_data_array("pos", "foo", data=[3.0, 4.0])
ext = b.create_data_array("ext", "foo", data=[0.4, 0.4])
mt = b.create_multi_tag("mt", "foo", positions=pos)
mt.extents = ext
mt.references.append(da_ts)
# it should return the scaled values
print(da_ts[np.logical_and(da_ts[:] >= 3., da_ts[:] <= 3.4)])
try:
    print("try...")
    # but it doesn't
    print(mt.tagged_data(0, "spike_times")[:])
    print("success!")
except Exception:
    print("aliasrangedim failed")
# a normal range dim works, because the timestamps are stored twice
da_ts.delete_dimensions()
dim = da_ts.append_range_dimension(ticks=da_ts[:], label="time", unit="s")
print(mt.tagged_data(0, "spike_times")[:])
I guess if I have to save the timestamps twice anyway, it's more efficient to store floats in the first place and then create the alias range dimension...
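For what it's worth, a minimal sketch of that float-based variant, reusing the block and tag from above (the names da_f and "spike_times_float" are made up for illustration):

# Store the already-scaled values as floats; the alias range dimension's
# ticks are then the data itself, so no polynomial coefficients are needed.
da_f = b.create_data_array("spike_times_float", "foo", data=timestamps / fs)
da_f.unit = "s"
da_f.label = "time"
da_f.append_alias_range_dimension()  # ticks == the data array itself
mt.references.append(da_f)
print(mt.tagged_data(0, "spike_times_float")[:])  # values between 3.0 and 3.4 s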
Might be related to #482