It is highly recommended not to edit this document. Any modification will eventually be lost, because it cannot be propagated back to the owning replication instance of this document.
Experimental data is recorded as HDF files[link] on the GPFS file system[link]. Access rights are linked to the user's DESY account and can be managed by the PI via the GAMMA portal[link]. The experimental data can be downloaded via the GAMMA portal, but it is advised to use the DESY computing infrastructure instead. Access points are ssh, the Maxwell Display Server[link], and JupyterHub[link]. We recommend JupyterHub for data exploration and the SLURM resources for high-performance computing - see FAB for easy usage of the infrastructure.
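For quick data exploration in a notebook, the HDF files can be opened directly with h5py. A minimal sketch (the file name and channel path below are placeholders for illustration, not actual FLASH beamtime paths; the snippet first writes a small stand-in file so it is self-contained):

```python
import h5py
import numpy as np

# Placeholder file name; real beamtime files live on GPFS.
path = "example_run.h5"

# Create a small stand-in file so the snippet is self-contained.
# "gmd/energy" is a made-up channel name, not a real DAQ address.
with h5py.File(path, "w") as f:
    f.create_dataset("gmd/energy", data=np.arange(10.0))

# Explore the file: list all groups/datasets, then read one channel.
with h5py.File(path, "r") as f:
    f.visit(print)                 # prints every group/dataset name
    energy = f["gmd/energy"][:]    # read the dataset into a numpy array

print(energy.mean())
```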

older ideas ...
(object oriented) https://gitlab.desy.de/christopher.passow/fdh-builder
Short descriptions including Links: → as Text
explain installation from a channel instead of a fixed environment; the environment file from the example repository can still be used
distribution
environment file (repository with examples)
Documentation
here VS repository VS sphinx
Screencast
use the hdfview plugin in JupyterLab
create a conda env with flashh5
conda create -n flashh5 python=3.10 # 3.10 not necessary, but 3.8 or higher is preferred
conda activate flashh5
conda install ipython numpy pandas # TODO: fix dependencies
conda install -c https://www.desy.de/~cpassow/condarepo/ flashh5
## on jhub
conda install ipykernel
python -m ipykernel install --user --name flashh5 --display-name "flashh5"
## to remove on jhub
## delete from: /home/$USER/.local/share/jupyter/kernels/
moved to repository?
class RunDirectory:
    def get_run_table(self):  # more or less information? RunComment | Number of Files | start & stop time?
        ...

    def get_run(self, daq, run_number):  # daq is not needed!
        ...


class Run:  # constructor optional without RunDirectory, or use self.path there
    def get_files(self):
        ...

    def get_channels(self):  # of file #1
        ...

    def get_start_time(self):  # better as attribute?
        ...

    def get_stop_time(self):  # which? | better as attribute?
        ...

    def to_df(self, daq_map):  # to_df(daq_map, slice) slice=[0:4] -> throw Exception
        ...

    def to_series(self, channel):
        ...

    def to_array(self, channel):
        ...
ideas
run.to_df(daq_map)
run.to_series(daq_adr or daq_map) # on channel only?
run.to_array(daq_adr) # on channel only?
## interesting?
# run.to_dask(daq_map)
# run.to_xarray(daq_map)
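The to_df/to_series/to_array ideas above could be sketched roughly as follows. Everything here is an assumption for illustration (the file layout, the "gmd/energy" channel name, and the daq_map structure), not the actual flashh5 API:

```python
import glob
import os
import tempfile

import h5py
import numpy as np
import pandas as pd


class Run:
    """Sketch: a run is a directory of HDF files sharing one train axis."""

    def __init__(self, path):
        self.path = path
        self.files = sorted(glob.glob(os.path.join(path, "*.h5")))

    def to_array(self, channel):
        # Concatenate one channel over all files of the run.
        parts = []
        for fname in self.files:
            with h5py.File(fname, "r") as f:
                parts.append(f[channel][:])
        return np.concatenate(parts)

    def to_series(self, channel):
        # Only sensible for scalar (one value per train) channels.
        return pd.Series(self.to_array(channel), name=channel)

    def to_df(self, daq_map):
        # daq_map: {column name: channel address}
        return pd.DataFrame({name: self.to_array(adr)
                             for name, adr in daq_map.items()})


# Demo with two small stand-in files (made-up channel name).
tmp = tempfile.mkdtemp()
for i in range(2):
    with h5py.File(os.path.join(tmp, f"file_{i}.h5"), "w") as f:
        f.create_dataset("gmd/energy", data=np.full(5, float(i)))

run = Run(tmp)
df = run.to_df({"energy": "gmd/energy"})
print(df.shape)  # (10, 1)
```

A to_dask or to_xarray variant would follow the same pattern, but wrap the per-file datasets lazily instead of concatenating eagerly.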