Is it suitable to put a delay in the USART transmit loop?
https://github.com/nimaltd/atc/blob/3d673c5d9c24748bbfeb49c53c38c574c0a735fa/atc.c#L74
If the serial port is opened at 115200 8N1, one bit takes
1 / 115200 ≈ 8.68 µs
and one frame (1 start + 8 data + 1 stop = 10 bits) takes
10 / 115200 ≈ 86.8 µs
At this rate the extra per-byte gap added by atc_delay(1) may still be tolerable.
But if we open the serial port at a higher baud rate, that per-byte gap may trigger an IDLE interrupt on the receiver (if the receiver is an STM32): the IDLE flag is set once the RX line has been idle for about one frame time, so any gap between bytes longer than that looks like the end of a transfer.
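To put rough numbers on that, here is a throwaway host-side calculation (not firmware). It assumes a 10-bit 8N1 frame and treats the per-byte delay as 1 ms, which is only my guess at what atc_delay(1) amounts to with a 1 kHz tick:

#include <stdio.h>

int main(void)
{
  /* Assumed per-byte delay: 1 ms (guess for atc_delay(1) with a 1 kHz tick). */
  const double delay_s = 0.001;
  const unsigned bauds[] = { 115200, 460800, 921600 };

  for (unsigned i = 0; i < sizeof bauds / sizeof bauds[0]; i++)
  {
    /* 8N1 frame: 1 start + 8 data + 1 stop = 10 bits. */
    double frame_s = 10.0 / bauds[i];
    /* The STM32 IDLE flag fires after roughly one frame of idle line, so a
       gap of delay_s between bytes corresponds to delay_s / frame_s idle frames. */
    printf("%u baud: frame = %.1f us, 1 ms gap = %.1f frame times\n",
           bauds[i], frame_s * 1e6, delay_s / frame_s);
  }
  return 0;
}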
And if we use an RTOS, some other higher-priority task may take over and delay the transmit task even further.
Or maybe I am wrong (I'm still a noob in embedded development). For comparison, the same function without the per-byte delay would just busy-wait on the flags:
void atc_transmit(atc_t *atc, uint8_t *data, uint16_t len)
{
  for (uint16_t i = 0; i < len; i++)
  {
    while (!LL_USART_IsActiveFlag_TXE(atc->usart))
      ;
    LL_USART_TransmitData8(atc->usart, data[i]);
  }
  while (!LL_USART_IsActiveFlag_TC(atc->usart))
    ;
}
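One caveat with the plain busy-wait version: it still blocks the CPU for the whole transfer (about 87 µs per byte at 115200, so roughly 9 ms for a 100-byte command); it just does so without inserting gaps on the wire. Whether that is acceptable depends on what else the system has to do.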
I don't understand your problem :(
Sorry for my bad English :(
I think putting such a long delay into the USART transmission is wrong. It's OK at a lower baud rate, but if we use a higher baud rate and the AT device is an STM32 (a modem, or some self-made device), the gap will trigger the AT device's IDLE interrupt.
What I mean is that the per-byte delay is too long for a higher-baud-rate device, and if we use an RTOS the USART transmission task may also be suspended.
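For context, this is the kind of receive pattern I have in mind on the STM32 side (a minimal sketch, not the actual library code; it assumes USART1, the LL driver, and made-up buffer names): the IDLE flag is treated as "end of response", so an extra gap in the middle of one response splits it in two.

#include "stm32f4xx_ll_usart.h"  /* adjust the header to the actual MCU family */

#define RX_BUF_SIZE 256
static volatile uint8_t  rx_buf[RX_BUF_SIZE];
static volatile uint16_t rx_len;
static volatile uint8_t  rx_done;  /* set when the receiver thinks the response ended */

void USART1_IRQHandler(void)
{
  if (LL_USART_IsActiveFlag_RXNE(USART1))
  {
    uint8_t b = LL_USART_ReceiveData8(USART1);  /* reading the data register clears RXNE */
    if (rx_len < RX_BUF_SIZE)
      rx_buf[rx_len++] = b;
  }
  if (LL_USART_IsActiveFlag_IDLE(USART1))
  {
    LL_USART_ClearFlag_IDLE(USART1);
    /* The line has been idle for about one frame time, so this sketch assumes
       the sender has finished. A per-byte delay on the sender's side makes
       this fire in the middle of a command or response. */
    rx_done = 1;
  }
}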
Why do you want to use a delay? atc_transmit does not need a delay.
I mean your code in atc.c, lines 69 to 79.
void atc_transmit(atc_t *atc, uint8_t *data, uint16_t len)
{
  for (uint16_t i = 0; i < len; i++)
  {
    while (!LL_USART_IsActiveFlag_TXE(atc->usart))
      atc_delay(1);  ///< here
    LL_USART_TransmitData8(atc->usart, data[i]);
  }
  while (!LL_USART_IsActiveFlag_TC(atc->usart))
    atc_delay(1);  ///< here
}
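If the delay was added so the loop cannot hang forever (for example when the UART is mis-configured under FreeRTOS), maybe a bounded busy-wait would solve that without stretching the gap between bytes. A rough sketch, reusing the same LL calls; the timeout constant is a made-up number that would need tuning:

void atc_transmit(atc_t *atc, uint8_t *data, uint16_t len)
{
  const uint32_t timeout = 100000;  /* made-up spin limit, tune for clock and baud rate */

  for (uint16_t i = 0; i < len; i++)
  {
    uint32_t spins = 0;
    /* Wait for TXE without adding an inter-byte gap, but give up instead of
       hanging forever if the flag never comes. */
    while (!LL_USART_IsActiveFlag_TXE(atc->usart))
    {
      if (++spins > timeout)
        return;
    }
    LL_USART_TransmitData8(atc->usart, data[i]);
  }

  uint32_t spins = 0;
  while (!LL_USART_IsActiveFlag_TC(atc->usart))
  {
    if (++spins > timeout)
      return;
  }
}

Whether the timeout should be reported to the caller is a separate question; the point is only that the wait no longer inserts a delay between every byte.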
Oh, I see. I can't remember exactly, but I had a problem with FreeRTOS and solved it via the delay.