Fits meta #12

Merged 22 commits on Nov 18, 2024
1 change: 1 addition & 0 deletions docs/user-guide/index.rst
@@ -12,6 +12,7 @@ For more details checkout the :ref:`reference`.

Brief Tour <tour>
data
raw
level0
spectrum
customization
102 changes: 69 additions & 33 deletions docs/user-guide/level0.rst
@@ -6,36 +6,72 @@ Level 0 Data

Overview
========
This section document the format of the level 0 binary files.
Level 0 binary files are raw telemetry or command response packets generated by the instrument.


Photon Packet
-------------
The packet which provides individual hit data or photons for each detector.

+-------------+-----------+--------+------------------------------------------------------------+
| Name | Bit Size | Type | Description |
+=============+===========+========+============================================================+
| HEADER WORD | 16 | UINT | value is always 65131 |
+-------------+-----------+--------+------------------------------------------------------------+
| FLAGS | 16 | UNIT | various flags, To be defined |
+-------------+-----------+--------+------------------------------------------------------------+
| Packet Size | 16 | UINT | The size of the packet which can be used to determine the |
| | | | number of hits included in the packet |
+-------------+-----------+--------+------------------------------------------------------------+
| Time Stamp | 48 | UINT | |
+-------------+-----------+--------+------------------------------------------------------------+
| Checksum | 16 | UINT | For data integrity |
+-------------+-----------+--------+------------------------------------------------------------+
| Pixel data | | | This field is repeated based on the number of hits detected|
+-------------+-----------+--------+------------------------------------------------------------+


+-------------+-----------+--------+------------------------------------------------------------+
| Pixel Data | | | |
+-------------+-----------+--------+------------------------------------------------------------+
| Detector id | 16 | UINT | The detector id for the location of the hit |
+-------------+-----------+--------+------------------------------------------------------------+
| Hit energy | 16 | UINT | The ADC value for the energy of the hit |
+-------------+-----------+--------+------------------------------------------------------------+
Level 0 data are provided in the `FITS file <https://fits.gsfc.nasa.gov/>`__ format.
For more information on how to read or write FITS files, see `astropy.fits <https://docs.astropy.org/en/stable/io/fits/index.html>`__.
This section describes the organization of the level 0 FITS files.
Level 0 FITS files generally include the unconverted data from the raw binary files of CCSDS packets.
The purpose of these files is to provide the raw data in a more convenient form for analysis.
They also provide metadata that summarizes the data in each file.


Review comment: Change "summary" to "summarizes".
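
Since these are standard FITS files, they can be opened directly with ``astropy``.
A minimal sketch (the filename below is only a placeholder, not the actual naming convention):

.. code-block:: python

    from astropy.io import fits

    # Placeholder filename; substitute a real level 0 file
    with fits.open("padre_meddea_l0_eventlist.fits") as hdul:
        hdul.info()                    # list the HDUs in the file
        print(repr(hdul[0].header))    # file-level metadata lives in the primary header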


Level 0 event files
-------------------

These files contain the data from all events that triggered the detectors.
They consist of three HDUs, including the primary HDU.
The primary HDU contains no data and is only used for metadata.
The two other HDUs are named `SCI` and `PKT`.
`SCI` includes the event data while `PKT` includes the packet data.
Each data packet may include one or more events, so there is a one-to-many relationship between packets and events.


Review comment: I am confused by this. Should it say "Each data packet may include one or more events, therefore there is a one to many..."?

To relate events to packets, each event records the sequence number of its originating packet.
This sequence number can be used to look up the packet data for that event.
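
A minimal sketch of this lookup, assuming only the layout described above (``SCI`` and ``PKT`` binary table HDUs sharing a ``seqcount`` column); the filename is a placeholder:

.. code-block:: python

    from astropy.io import fits
    from astropy.table import Table

    with fits.open("padre_meddea_l0_eventlist.fits") as hdul:  # placeholder filename
        events = Table(hdul["SCI"].data)
        packets = Table(hdul["PKT"].data)

    # Look up the packet that carried the first event
    first_event = events[0]
    matching_packet = packets[packets["seqcount"] == first_event["seqcount"]]
    print(matching_packet)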

Primary HDU
***********
No data is provided.
Stay tuned for a list of metadata.

PKT HDU
*******
The following columns are provided for each data packet.
The bits column gives the number of significant bits, not the bit length of the column itself.
The columns in the FITS file are stored using the smallest possible data type.

======== ============================================= ====
name     description                                   bits
======== ============================================= ====
seqcount packet sequence number, should be consecutive 12
pkttimes the packet time in seconds since EPOCH        32
pktclock the packet subsecond time in clocks           32
livetime live time                                     16
inttime  integration time in real time                 16
flags    flags                                         16
======== ============================================= ====
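
Because ``seqcount`` should be consecutive, a quick data-quality check is to look for gaps.
A sketch, assuming the counter wraps at 2**12 (the 12 significant bits listed above) and using a placeholder filename:

.. code-block:: python

    import numpy as np
    from astropy.io import fits

    with fits.open("padre_meddea_l0_eventlist.fits") as hdul:  # placeholder filename
        seq = np.asarray(hdul["PKT"].data["seqcount"], dtype=np.int64)

    # Consecutive packets should differ by exactly 1 (modulo the assumed wrap value)
    steps = np.diff(seq) % 2**12
    n_gaps = np.count_nonzero(steps != 1)
    print(f"{n_gaps} discontinuities found in the packet sequence")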

SCI HDU
*******
The following columns are provided for each event or photon detected.
The bits column gives the number of significant bits, not the bit length of the column itself.
The columns in the FITS file are stored using the smallest possible data type.

======== ============================================================ ====
name     description                                                  bits
======== ============================================================ ====
seqcount packet sequence number                                       12
clocks   the clock number                                             16
asic     the asic number or detector id                               3
channel  the asic channel which is related to the pixel               5
atod     the uncalibrated energy of the event in ADC counts           12
baseline the baseline measurement if it exists, otherwise all zeros   12
pkttimes the packet time in seconds since EPOCH, also exists in PKT   32
pktclock the packet time in clocks since EPOCH, also exists in PKT    32
======== ============================================================ ====
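
The ``calc_time`` helper imported in the processing code below presumably handles the conversion of these raw times to absolute times.
A standalone sketch, assuming the package EPOCH of 2000-01-01 UTC (defined in the ``__init__.py`` diff below) and a hypothetical ``CLOCK_HZ`` subsecond clock frequency:

.. code-block:: python

    import numpy as np
    from astropy.io import fits
    from astropy.time import Time, TimeDelta

    EPOCH = Time("2000-01-01 00:00", scale="utc")
    CLOCK_HZ = 20_000_000  # assumed clock frequency; check the instrument documentation

    with fits.open("padre_meddea_l0_eventlist.fits") as hdul:  # placeholder filename
        sci = hdul["SCI"].data
        event_times = (
            EPOCH
            + TimeDelta(np.asarray(sci["pkttimes"], dtype=float), format="sec")
            + TimeDelta(np.asarray(sci["pktclock"], dtype=float) / CLOCK_HZ, format="sec")
        )

    print(event_times[0].isot)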

Level 0 spectrum files
----------------------
Summary spectra are created for 24 pixels at a regular cadence (normally every 10 s).
Each spectrum has a total of 512 energy bins.

Level 0 housekeeping files
--------------------------
These files contain housekeeping data as described in the housekeeping packet.
They also include any register read responses received during that time period.
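
The HDU layout of the housekeeping files is not spelled out here; judging from the processing code in this pull request, housekeeping values land in an ``HK`` binary table HDU and register read responses in a ``READ`` HDU.
A sketch under that assumption, with a placeholder filename:

.. code-block:: python

    from astropy.io import fits
    from astropy.table import Table

    with fits.open("padre_meddea_l0_hk.fits") as hdul:  # placeholder filename
        hk = Table(hdul["HK"].data)  # one row per housekeeping packet
        # Register read responses; the READ HDU may be empty
        reads = Table(hdul["READ"].data) if hdul["READ"].data is not None else Table()

    print(hk.colnames)
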
41 changes: 41 additions & 0 deletions docs/user-guide/raw.rst
@@ -0,0 +1,41 @@
.. _raw:

***************
Raw Binary Data
***************

Overview
========
This section documents the format of the raw binary files.
Raw binary files contain the telemetry or command response packets generated by the instrument.


Photon Packet
-------------
This packet provides individual hit data, or photons, for each detector.

+-------------+-----------+--------+------------------------------------------------------------+
| Name | Bit Size | Type | Description |
+=============+===========+========+============================================================+
| HEADER WORD | 16 | UINT | value is always 65131 |
+-------------+-----------+--------+------------------------------------------------------------+
| FLAGS       | 16        | UINT   | various flags, to be defined                               |
+-------------+-----------+--------+------------------------------------------------------------+
| Packet Size | 16 | UINT | The size of the packet which can be used to determine the |
| | | | number of hits included in the packet |
+-------------+-----------+--------+------------------------------------------------------------+
| Time Stamp | 48 | UINT | |
+-------------+-----------+--------+------------------------------------------------------------+
| Checksum | 16 | UINT | For data integrity |
+-------------+-----------+--------+------------------------------------------------------------+
| Pixel data | | | This field is repeated based on the number of hits detected|
+-------------+-----------+--------+------------------------------------------------------------+


+-------------+-----------+--------+------------------------------------------------------------+
| Pixel Data | | | |
+-------------+-----------+--------+------------------------------------------------------------+
| Detector id | 16 | UINT | The detector id for the location of the hit |
+-------------+-----------+--------+------------------------------------------------------------+
| Hit energy | 16 | UINT | The ADC value for the energy of the hit |
+-------------+-----------+--------+------------------------------------------------------------+
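
A minimal parsing sketch based only on the field sizes in the tables above. The byte order and the interpretation of the packet size field (taken here as the total packet length in bytes) are assumptions and should be checked against the flight software:

.. code-block:: python

    import struct

    def parse_photon_packet(buf: bytes):
        """Unpack one photon packet following the layout in the tables above."""
        # Fixed header: header word, flags, packet size (byte order assumed big-endian)
        header_word, flags, packet_size = struct.unpack_from(">HHH", buf, 0)
        if header_word != 65131:
            raise ValueError("unexpected header word")
        timestamp = int.from_bytes(buf[6:12], "big")   # 48-bit time stamp
        (checksum,) = struct.unpack_from(">H", buf, 12)

        # Pixel data: repeated (detector id, hit energy) pairs after the 14-byte header
        hits = []
        offset = 14
        while offset + 4 <= packet_size:
            detector_id, hit_energy = struct.unpack_from(">HH", buf, offset)
            hits.append((detector_id, hit_energy))
            offset += 4
        return flags, timestamp, checksum, hits
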
14 changes: 11 additions & 3 deletions padre_meddea/__init__.py
@@ -2,6 +2,8 @@
import os
from pathlib import Path

from astropy.time import Time

try:
from ._version import version as __version__
from ._version import version_tuple
@@ -31,9 +33,6 @@
_data_directory = _package_directory / "data"
_test_files_directory = _package_directory / "data" / "test"

MISSION_NAME = "PADRE"
INSTRUMENT_NAME = "MeDDEA"

# the ratio of detector area for large pixels versus small pixels
RATIO_TOTAL_LARGE_TO_SMALL_PIX = 0.947

@@ -63,4 +62,13 @@
10.73,
]

APID = {
"spectrum": 0xA2, # decimal 162
"photon": 0xA0, # decimal 160
"housekeeping": 0xA3, # decimal 163
"cmd_resp": 0x99, # decimal 153
}

EPOCH = Time("2000-01-01 00:00", scale="utc")

log.debug(f"padre_meddea version: {__version__}")
146 changes: 112 additions & 34 deletions padre_meddea/calibration/calibration.py
@@ -14,10 +14,15 @@

import padre_meddea
from padre_meddea import log
from padre_meddea.io import file_tools
from padre_meddea.io import file_tools, fits_tools

from padre_meddea.util.util import create_science_filename
from padre_meddea.util.util import create_science_filename, calc_time
from padre_meddea.io.file_tools import read_raw_file
from padre_meddea.io.fits_tools import (
add_process_info_to_header,
get_primary_header,
get_std_comment,
)

__all__ = [
"process_file",
@@ -45,25 +50,39 @@
# Check if the LAMBDA_ENVIRONMENT environment variable is set
lambda_environment = os.getenv("LAMBDA_ENVIRONMENT")
output_files = []
file_path = Path(filename)

if filename.suffix == ".bin":
parsed_data = read_raw_file(filename)
if file_path.suffix == ".bin":
parsed_data = read_raw_file(file_path)
if parsed_data["photons"] is not None: # we have event list data
ph_list = parsed_data["photons"]
hdu = fits.PrimaryHDU(data=None)
hdu.header["DATE"] = (Time.now().fits, "FITS file creation date in UTC")
fits_meta = read_fits_keyword_file(
padre_meddea._data_directory / "fits_keywords_primaryhdu.csv"
event_list, pkt_list = parsed_data["photons"]
primary_hdr = get_primary_header()
primary_hdr = add_process_info_to_header(primary_hdr)
primary_hdr["LEVEL"] = (0, get_std_comment("LEVEL"))
primary_hdr["DATATYPE"] = ("event_list", get_std_comment("DATATYPE"))
primary_hdr["ORIGAPID"] = (
padre_meddea.APID["photon"],
get_std_comment("ORIGAPID"),
)
for row in fits_meta:
hdu.header[row["keyword"]] = (row["value"], row["comment"])
bin_hdu = fits.BinTableHDU(data=Table(ph_list))
hdul = fits.HDUList([hdu, bin_hdu])
primary_hdr["ORIGFILE"] = (file_path.name, get_std_comment("ORIGFILE"))

for this_keyword in ["DATE-BEG", "DATE-END", "DATE-AVG"]:
primary_hdr[this_keyword] = (
event_list.meta.get(this_keyword, ""),
get_std_comment(this_keyword),
)

empty_primary_hdu = fits.PrimaryHDU(header=primary_hdr)
pkt_hdu = fits.BinTableHDU(pkt_list, name="PKT")
pkt_hdu.add_checksum()
hit_hdu = fits.BinTableHDU(event_list, name="SCI")
hit_hdu.add_checksum()
hdul = fits.HDUList([empty_primary_hdu, hit_hdu, pkt_hdu])

path = create_science_filename(
"meddea",
ph_list["time"][0].fits,
"l1",
time=primary_hdr["DATE-BEG"],
level="l1",
descriptor="eventlist",
test=True,
version="0.1.0",
@@ -77,21 +96,89 @@

# Write the file, with the overwrite option controlled by the environment variable
hdul.writeto(path, overwrite=overwrite)

# Store the output file path in a list
output_files = [path]
output_files.append(path)
if parsed_data["housekeeping"] is not None:
hk_data = parsed_data["housekeeping"]
hk_data.meta["INSTRUME"] = "meddea"

if "CHECKSUM" in hk_data.colnames:
hk_data.remove_column("CHECKSUM")

# send data to AWS Timestream for Grafana dashboard
record_timeseries(hk_data, "housekeeping")
hk_table = Table(hk_data)

primary_hdr = get_primary_header()
primary_hdr = add_process_info_to_header(primary_hdr)
primary_hdr["LEVEL"] = (0, get_std_comment("LEVEL"))
primary_hdr["DATATYPE"] = ("housekeeping", get_std_comment("DATATYPE"))
primary_hdr["ORIGAPID"] = (
padre_meddea.APID["housekeeping"],
get_std_comment("ORIGAPID"),
)
primary_hdr["ORIGFILE"] = (file_path.name, get_std_comment("ORIGFILE"))

date_beg = calc_time(hk_data["timestamp"][0])
primary_hdr["DATEREF"] = (date_beg.fits, get_std_comment("DATEREF"))

hk_table["seqcount"] = hk_table["CCSDS_SEQUENCE_COUNT"]
colnames_to_remove = [
"CCSDS_VERSION_NUMBER",
"CCSDS_PACKET_TYPE",
"CCSDS_SECONDARY_FLAG",
"CCSDS_SEQUENCE_FLAG",
"CCSDS_APID",
"CCSDS_SEQUENCE_COUNT",
"CCSDS_PACKET_LENGTH",
"CHECKSUM",
"time",
]
for this_col in colnames_to_remove:
if this_col in hk_table.colnames:
hk_table.remove_column(this_col)

empty_primary_hdu = fits.PrimaryHDU(header=primary_hdr)
hk_hdu = fits.BinTableHDU(data=hk_table, name="HK")
hk_hdu.add_checksum()

# add command response data if it exists
if parsed_data["cmd_resp"] is not None:
data_ts = parsed_data["cmd_resp"]
this_header = fits.Header()
this_header["DATEREF"] = (
data_ts.time[0].fits,
get_std_comment("DATEREF"),
)
record_timeseries(data_ts, "housekeeping")
data_table = Table(data_ts)
colnames_to_remove = [
"CCSDS_VERSION_NUMBER",
"CCSDS_PACKET_TYPE",
"CCSDS_SECONDARY_FLAG",
"CCSDS_SEQUENCE_FLAG",
"CCSDS_APID",
"CCSDS_SEQUENCE_COUNT",
"CCSDS_PACKET_LENGTH",
"CHECKSUM",
"time",
]
for this_col in colnames_to_remove:
if this_col in data_table.colnames:
data_table.remove_column(this_col)
cmd_hdu = fits.BinTableHDU(data=data_table, name="READ")
cmd_hdu.add_checksum()
else: # if None, still include an empty binary table
this_header = fits.Header()
cmd_hdu = fits.BinTableHDU(data=None, header=this_header, name="READ")
hdul = fits.HDUList([empty_primary_hdu, hk_hdu, cmd_hdu])

# calibrated_file = calibrate_file(data_filename)
# data_plot_files = plot_file(data_filename)
# calib_plot_files = plot_file(calibrated_file)
path = create_science_filename(
time=date_beg,
level="l1",
descriptor="hk",
test=True,
version="0.1.0",
)
hdul.writeto(path, overwrite=overwrite)
output_files.append(path)
if parsed_data["spectra"] is not None:
spec_data = parsed_data["spectra"]

# add other tasks below
return output_files
@@ -146,12 +233,3 @@
# if can't read the file

return None


def read_fits_keyword_file(csv_file: Path):
"""Read csv file with default fits metadata information."""
fits_meta_table = ascii.read(
padre_meddea._data_directory / "fits_keywords_primaryhdu.csv",
format="csv",
)
return fits_meta_table
2 changes: 2 additions & 0 deletions padre_meddea/data/README.rst
@@ -39,3 +39,5 @@ Stores detector constants.
hk_channel_defs.csv
-------------------
Stores the definitions for the values provided in housekeeping packets.

fits_
2 changes: 1 addition & 1 deletion padre_meddea/data/calibration/README.rst
@@ -1,4 +1,4 @@
Calbiration directory
Calibration directory
=====================

This directory contains calibration files included with the package source