Long time data logging in small files #513

Open
Anjali8340 opened this issue Feb 15, 2025 · 3 comments

Comments

@Anjali8340

Hello,
I am using a Teensy 4.1 to log data to an SD card. I log one sample every millisecond, and each sample is 16 bytes. When the file size exceeds the 16 MB limit, I create a new file and close the previous one. I use a ring buffer to store the data and write to the SD card only once 512 bytes have been collected.

The function below is called whenever data needs to be logged:
void buff_fill(uint32_t t)
{
  size_t n = rb.bytesUsed();

  rb.write(startBytes, NUM_START_BYTES);   // 3 bytes
  rb.write(&t, NUM_T_BYTES);               // 4 bytes
  rb.write(currentState);                  // 1 byte
  rb.write(&Trig_count, NUM_COUNT_BYTES);  // 6 bytes
  rb.write(&DAC_value, NUM_DAC_BYTES);     // 2 bytes
  if (n >= SECTOR_SIZE && !file.isBusy())
  {
    rb.writeOut(SECTOR_SIZE);
    if (file.curPosition() >= MAX_FILE_SIZE)
    {
      createNewFile();
    }
  }
}

bool createNewFile()
{
  // Close the current file if it's open
  if (file.isOpen())
  {
    // if (!rb.sync())
    //   return false;
    // file.truncate();
    file.sync(); // Ensure data is written before closing
    file.close();
  }
  getFileName(fc);
  file.open(fileName, O_CREAT | O_WRONLY);
  file.preAllocate(MAX_FILE_SIZE);

  return true;
}

The problem is that whenever a new file is created, a long delay blocks my code for between 5 ms and 50 ms, and some samples are lost. Logging is interrupt based: when an interrupt occurs, a flag is set and I call this buff_fill function to write the data. The interrupt fires every millisecond, so I cannot afford a blocking call longer than about 900 microseconds.

What should I do? Can you suggest a better way to approach this issue?

@greiman
Owner

greiman commented Feb 15, 2025

You could create a series of per-allocated files. Creating a file and allocating space on an SD can take an arbitrary amount of time since there is no free list for space. A linear search is done for directory entries and file data space.
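A rough sketch of that idea, assuming SdFat v2 on the Teensy 4.1 (the file count, 16 MB size, and LOGnn.BIN names are only placeholders): create and pre-allocate every log file before logging starts, so the slow directory and cluster searches happen outside the timing-critical loop.

#include "SdFat.h"

SdFs sd;  // assumes sd.begin(...) has already succeeded

// Create and pre-allocate all log files up front. Returns false on any failure.
bool preCreateLogFiles(size_t nFiles, uint64_t sizePerFile) {
  FsFile f;
  char name[16];
  for (size_t i = 0; i < nFiles; i++) {
    snprintf(name, sizeof(name), "LOG%02u.BIN", (unsigned)i);
    // O_TRUNC guarantees the file is empty, which preAllocate() requires.
    if (!f.open(name, O_RDWR | O_CREAT | O_TRUNC)) return false;
    // Reserve a contiguous run of clusters so later writes never stall
    // on cluster allocation.
    if (!f.preAllocate(sizePerFile)) return false;
    f.close();
  }
  return true;
}

During logging, only these existing files are written; no file is ever created inside the 1 ms path.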

@hattesen

hattesen commented Feb 17, 2025

@greiman I assume you meant "pre-allocated";)

Can you provide a link to example code for pre-allocating file space on an SD-card?

@greiman
Owner

greiman commented Feb 17, 2025

Sorry about the typo, it is pre-allocate.

First open an empty file like this. Then allocate a block of contiguous space like this.

For best performance a contiguous file write is important. 512-byte sectors are emulated in very large flash pages. Recording Units (RUs) can be as large as 512 KB on modern cards.

I have used an array of file objects when the size of each file must be limited. That way you don't need to close and open files while logging data.

FsFile files[NFILE];
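A sketch of that approach combined with the pre-allocation above, assuming SdFat v2 and the same RingBuf the question uses; NFILE, MAX_FILE_SIZE, and the LOGnn.BIN names are placeholders:

#include "SdFat.h"
#include "RingBuf.h"

const size_t NFILE = 8;                                 // placeholder file count
const uint64_t MAX_FILE_SIZE = 16UL * 1024UL * 1024UL;  // 16 MB, as in the question

SdFs sd;
FsFile files[NFILE];
RingBuf<FsFile, 4096> rb;   // 4096-byte ring buffer (size is a placeholder)
size_t cur = 0;             // index of the file currently being written

// Open and pre-allocate every file once, before logging starts.
void openAllFiles() {
  char name[16];
  for (size_t i = 0; i < NFILE; i++) {
    snprintf(name, sizeof(name), "LOG%02u.BIN", (unsigned)i);
    files[i].open(name, O_RDWR | O_CREAT | O_TRUNC);
    files[i].preAllocate(MAX_FILE_SIZE);
  }
  rb.begin(&files[0]);      // attach the ring buffer to the first file
}

// Called from the logging path: switching files is just an index change plus
// re-targeting the ring buffer, so there is no open()/close() while logging.
void switchFileIfFull() {
  if (files[cur].curPosition() >= MAX_FILE_SIZE && (cur + 1) < NFILE) {
    rb.sync();              // flush any bytes still buffered for the old file
    files[cur].sync();      // commit its data and directory entry
    cur++;
    rb.begin(&files[cur]);  // RingBuf::begin() re-targets the buffer
  }
}

The sync() calls still cost some SD time, so in a hard 1 ms budget they are best issued when the buffer is nearly empty, but they avoid the unbounded directory and free-space search that creating a new file triggers.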
