# sift_py.data_import.ch10

| CLASS | DESCRIPTION |
|---|---|
| BaseCh10File | Base class for uploading IRIG Chapter 10/Chapter 11 files. |
| Ch10UploadService | Service to upload ch10 files. |
## BaseCh10File

Base class for uploading IRIG Chapter 10/Chapter 11 files.
Implement a concrete version of this class that parses a ch10 stream and returns a CSV row of data on each iteration.
Set gzip to True if sending a compressed stream.
Example:

```python
from typing import Dict

from sift_py.data_import.ch10 import BaseCh10File


class Ch10(BaseCh10File):
    def __init__(self, path):
        self.file = open(path, "rb")
        self.csv_config_data_columns = None

    def initialize_csv_data_columns(self):
        self.csv_config_data_columns = self.process_ch10_computer_f1_packet()

    def process_ch10_computer_f1_packet(self) -> Dict[int, dict]:
        # Processes the first Computer F1 packet
        # and returns the measurements as a dict.
        ...

    def process_ch10_pcm_packet(self) -> str:
        # Processes the data packets and returns
        # a CSV row.
        ...

    def __next__(self) -> str:
        # On each iteration, return a row of data for the CSV file.
        if end_of_file:
            raise StopIteration()
        else:
            return self.process_ch10_pcm_packet()
```
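
As a rough usage sketch, a concrete subclass is simply an iterator over CSV rows and can be exercised on its own before handing it to Ch10UploadService. The file path and channel contents here are placeholders:

```python
# Sketch only: Ch10 is the hypothetical subclass above and
# "flight_test.ch10" is a placeholder path.
ch10 = Ch10("flight_test.ch10")
# ch10.gzip = True  # only if the subclass yields a gzip-compressed stream

# Populate csv_config_data_columns, then pull CSV rows until StopIteration.
ch10.initialize_csv_data_columns()
first_row = next(ch10)
```
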
| METHOD | DESCRIPTION |
|---|---|
| __iter__ | |
| __next__ | |
| initialize_csv_data_columns | Must populate the csv_config_data_columns attribute. |
| ATTRIBUTE | DESCRIPTION |
|---|---|
| csv_config_data_columns | TYPE: Dict[int, dict] |
| gzip | TYPE: bool |
### initialize_csv_data_columns

Must populate the csv_config_data_columns attribute, which becomes the data_columns entry in the CsvConfig.
See the Sift data_import module or API docs for the schema.
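
As a minimal illustration of the shape that attribute might take, keys are CSV column numbers and values describe each channel; the field names follow the CsvConfig data_columns schema but should be checked against the data_import docs:

```python
# Illustrative only: column 2 and 3 of the generated CSV are double channels.
# Verify field names and data type strings against the CsvConfig schema.
csv_config_data_columns = {
    2: {"name": "pressure", "data_type": "CHANNEL_DATA_TYPE_DOUBLE"},
    3: {"name": "temperature", "data_type": "CHANNEL_DATA_TYPE_DOUBLE"},
}
```
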
## Ch10UploadService

```python
Ch10UploadService(rest_conf: SiftRestConfig)
```

Bases: CsvUploadService

Service to upload ch10 files.
| METHOD | DESCRIPTION |
|---|---|
| simple_upload | Uploads the CSV file pointed to by path to the specified asset. |
| upload | Uploads the ch10 file to the specified asset. |
| upload_from_url | Uploads the CSV file pointed to by url using a custom CSV config. |
| ATTRIBUTE | DESCRIPTION |
|---|---|
| RUN_PATH | |
| UPLOAD_PATH | |
| URL_PATH | |
### simple_upload

```python
simple_upload(
    asset_name: str,
    path: Union[str, Path],
    first_data_row: int = 2,
    time_column: int = 1,
    time_format: TimeFormatType = ABSOLUTE_DATETIME,
    run_name: Optional[str] = None,
    run_id: Optional[str] = None,
    units_row: Optional[int] = None,
    descriptions_row: Optional[int] = None,
    relative_start_time: Optional[str] = None,
) -> DataImportService
```
Uploads the CSV file pointed to by path to the specified asset. This function infers the data types and makes assumptions about how the data is formatted; see the options below for the parameters that can be overridden. Use upload if you need to specify a custom CSV config.

- Override first_data_row to specify the first row that contains data. Default is 2.
- Override time_column to specify which column contains timestamp information. Default is 1.
- Override time_format to specify the time data format. Default is TimeFormatType.ABSOLUTE_DATETIME.
- Override run_name to specify the name of the run to create for this data. Default is None.
- Override run_id to specify the id of the run to add this data to. Default is None.
- Override units_row to specify which row contains unit information. Default is None.
- Override descriptions_row to specify which row contains channel description information. Default is None.
- Override relative_start_time if a relative time format is used. Default is None.
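
A minimal usage sketch, assuming placeholder connection details and a CSV file that matches the defaults (data from row 2, absolute datetimes in column 1):

```python
from sift_py.data_import.ch10 import Ch10UploadService
from sift_py.rest import SiftRestConfig

# Placeholder connection details.
rest_config: SiftRestConfig = {
    "uri": "https://api.example.siftstack.com",
    "apikey": "my-api-key",
}

svc = Ch10UploadService(rest_config)

# Upload telemetry.csv (placeholder path) with the default assumptions,
# creating a new run for the imported data.
import_service = svc.simple_upload(
    "my-asset",
    "telemetry.csv",
    run_name="Flight 42",
)
```
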
### upload

```python
upload(
    ch10_file: BaseCh10File,
    asset_name: str,
    time_format: TimeFormatType = ABSOLUTE_UNIX_NANOSECONDS,
    run_name: Optional[str] = None,
    run_id: Optional[str] = None,
) -> DataImportService
```
Uploads the ch10 file to the specified asset.

- Override time_format to specify the time data format. Default is TimeFormatType.ABSOLUTE_UNIX_NANOSECONDS.
- Override run_name to specify the name of the run to create for this data. Default is None.
- Override run_id to specify the id of the run to add this data to. Default is None.
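
A minimal sketch tying this to the BaseCh10File example; Ch10 and rest_config are the placeholders from the earlier snippets:

```python
from sift_py.data_import.ch10 import Ch10UploadService

# Ch10 and rest_config are the placeholders defined in the earlier sketches.
svc = Ch10UploadService(rest_config)
ch10 = Ch10("flight_test.ch10")

# Timestamps default to absolute unix nanoseconds; attach the data to a new run.
import_service = svc.upload(ch10, "my-asset", run_name="Flight 42")
```
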
### upload_from_url

```python
upload_from_url(
    url: str, csv_config: CsvConfig
) -> DataImportService
```
Uploads the CSV file pointed to by url using a custom CSV config.
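
A minimal sketch, assuming a publicly reachable URL and an illustrative CsvConfig; the config field names follow the data_import schema but should be checked against the module docs:

```python
from sift_py.data_import.ch10 import Ch10UploadService
from sift_py.data_import.config import CsvConfig

# Illustrative config: column 1 holds absolute datetimes, column 2 a double channel.
csv_config = CsvConfig(
    {
        "asset_name": "my-asset",
        "first_data_row": 2,
        "time_column": {
            "format": "TIME_FORMAT_ABSOLUTE_DATETIME",
            "column_number": 1,
        },
        "data_columns": {
            2: {"name": "pressure", "data_type": "CHANNEL_DATA_TYPE_DOUBLE"},
        },
    }
)

# rest_config is the placeholder from the simple_upload sketch.
svc = Ch10UploadService(rest_config)
import_service = svc.upload_from_url(
    "https://example.com/exports/telemetry.csv", csv_config
)
```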