I have a serial receive module using FreeRTOS on an Arduino. When I receive my data packet, it's exactly half the size I was expecting.
Here’s my code:
void SerialReceiveTask(void* pvParameters) {
    (void) pvParameters;

    _read_buffer = 0;    // Start in the first buffer
    _decode_buffer = 1;  // Initialise the decode buffer to be the opposite
    xSerialDataSemaphore = xSemaphoreCreateBinary();
    uint16_t num_bytes;

    Serial1.begin(38400);
    Serial1.setTimeout(DATA_TIMEOUT);
    DEBUG_PRINTLN("Entering Serial Receive Main loop");

    while (1) {
        // Zero the important bits of the buffer so decoding becomes easier:
        _buffer[_read_buffer][0] = 0;
        _buffer[_read_buffer][1] = 0;
        _buffer[_read_buffer][2] = 0;
        _buffer[_read_buffer][3] = 0;
        _buffer[_read_buffer][PACKET_SIZE - 1] = 0;

        // Wait for data:
        while (!Serial1.available()) {
            vTaskDelay(pdMS_TO_TICKS(5));
        }

        // Got data, so read it into the buffer:
        num_bytes = Serial1.readBytes(_buffer[_read_buffer], PACKET_SIZE);
        DEBUG_PRINT("Received data bytes: ");
        DEBUG_PRINTLN(num_bytes);

        bufferswap(_decode_buffer);
        bufferswap(_read_buffer);
        xSemaphoreGive(xSerialDataSemaphore);

        // Now we wait for the timeout period again:
        vTaskDelay(pdMS_TO_TICKS(READ_DELAY));
    }
}
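One workaround I'm considering is looping until the whole packet has accumulated, rather than trusting a single readBytes() call to return everything at once. Below is an untested, host-side sketch of that idea; the StubStream type and the 64-byte chunk size (my guess at the AVR core's RX ring buffer) are illustrative assumptions, not real Arduino API:

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <cstring>

// Hypothetical stand-in for Serial1: delivers a 1029-byte packet in chunks
// of at most 64 bytes per call, mimicking a small RX ring buffer.
struct StubStream {
    size_t remaining = 1029;  // bytes left in the simulated packet
    size_t read(uint8_t* dst, size_t len) {
        size_t n = std::min({len, remaining, (size_t)64});
        std::memset(dst, 0xAB, n);  // dummy payload
        remaining -= n;
        return n;
    }
};

// Keep reading and accumulating until the full packet is in the buffer.
template <typename Stream>
size_t read_full_packet(Stream& s, uint8_t* buf, size_t packet_size) {
    size_t total = 0;
    while (total < packet_size) {
        size_t n = s.read(buf + total, packet_size - total);
        if (n == 0) break;  // on the real board: timed out, give up
        total += n;
    }
    return total;
}
```

On the board, the inner read would be Serial1.readBytes(buf + total, packet_size - total), with the loop exiting once total reaches PACKET_SIZE or readBytes() returns 0.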
My data packets are 1029 bytes long, but readBytes() only reports 515 bytes received. The timeout I've set (250 ms) is long enough for the data to be received: the transfer takes about 213 ms. Packets arrive once per second, which is also the rate at which I give the semaphore to the decode task.
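For what it's worth, my understanding from reading the Arduino Stream source is that the readBytes() timeout is per byte, not for the whole packet: the clock restarts each time a byte arrives, so a 250 ms timeout shouldn't truncate the transfer as long as the inter-byte gap stays under 250 ms. A rough host-side model of that loop (the StallingStream stub and the 515-byte stall point are made up for illustration):

```cpp
#include <cstddef>
#include <cstdint>

// Toy model of a stream whose timedRead() succeeds for 515 bytes and then
// "times out", simulating a mid-packet stall longer than the timeout.
struct StallingStream {
    size_t delivered = 0;
    int timedRead() {
        if (delivered >= 515) return -1;  // simulated per-byte timeout
        ++delivered;
        return 0xAB;                      // dummy byte
    }
};

// Mirrors the Stream::readBytes() loop: it only stops early when a single
// byte fails to arrive within the timeout window.
template <typename Stream>
size_t model_readBytes(Stream& s, uint8_t* buf, size_t length) {
    size_t count = 0;
    while (count < length) {
        int c = s.timedRead();
        if (c < 0) break;                 // per-byte timeout expired
        buf[count++] = (uint8_t)c;
    }
    return count;                         // partial count on a stall
}
```

So a short result like 515 would mean either the loop genuinely stalled for longer than DATA_TIMEOUT mid-packet, or only 515 bytes ever reached the port.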
Is there a bug in Serial.readBytes()? Is it counting uint16_t words instead of uint8_t bytes? If so, should I be reading into uint16_ts instead?
I can successfully decode the data with a separate Python program on my PC reading the same lines simultaneously, so I know the data is arriving correctly.