influxdb-client-python
The specified timestamp does not take effect
Specifications
- Client Version: 1.37.0
- InfluxDB Version: InfluxDB OSS 2.7.1
- Platform: CentOS 7.9 x86_64

Hi all, I have the last 15 minutes of performance data, each point with its own timestamp. When I write the data to InfluxDB, it only writes the last point, and with the current timestamp. My data looks like this:
metrics = [
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device1',
            'obj': 'obj1'
        },
        'fields': {
            'ops': 123
        },
        'timestamp': 1690266060
    },
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device1',
            'obj': 'obj1'
        },
        'fields': {
            'ops': 123
        },
        'timestamp': 1690266160
    },
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device2',
            'obj': 'obj2'
        },
        'fields': {
            'ops2': 456
        },
        'timestamp': 1690266160
    },
    # ... and so on. I have many devices like device1/device2/devicex, and each device has many objs like obj1/obj2/objx.
    # The same obj has many points covering the last 15 minutes: one point per minute for the same field of the same tag set,
    # with timestamps from within the last 15 minutes rather than the current time (see the sketch after this list).
]
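For context, a sketch of how such a list might be built (the device/obj names, the field value, and taking "now" as the collection time are all assumptions for illustration):

import time

# Sketch: one point per minute for the last 15 minutes, for each device/obj pair.
now = int(time.time())
metrics = []
for device, obj in [('device1', 'obj1'), ('device2', 'obj2')]:
    for minute in range(15):
        metrics.append({
            'measurement': 'perf',
            'tags': {'device': device, 'obj': obj},
            'fields': {'ops': 123},
            'timestamp': now - 60 * (15 - minute),  # epoch seconds within the last 15 minutes
        })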
The problem is that when I write to InfluxDB, it keeps only the last value instead of 15 values, and the timestamp is changed to the time when the script runs.
Code sample to reproduce problem
from influxdb_client import InfluxDBClient, WritePrecision

def send_metric(metrics):
    with InfluxDBClient(url=myurl, token=mytoken, org=myorg) as _client:
        with _client.write_api() as _write_client:
            _write_client.write(bucket, org, metrics, write_precision=WritePrecision.S)

send_metric(metrics)
Expected behavior
For example, suppose it is now 17:30. When I run send_metric(), InfluxDB should add many points with the timestamps specified in the metrics list (they may be 17:14~17:29 if I collected the metric data at 17:29).
For the tag set 'device1.obj1', there should be 15 values for the field 'ops'.
Actual behavior
InfluxDB adds only one point, with the last value, for the tag set 'device1.obj1', and the timestamp is changed to 17:30 (the time when I ran send_metric).
Additional info
I tried writing a single point as a test.
metrics = [
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device1',
            'obj': 'obj1'
        },
        'fields': {
            'ops': 123
        },
        'timestamp': 1690266060
    }
]
The timestamp 1690266060 is the local time Jul 25th 14:21:00. I ran send_metric at 14:30; the point is written to InfluxDB, but its timestamp is 14:30. I also tried changing the timestamp to milliseconds, with the same result.
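For reference, a Flux query along these lines shows which _time was actually stored for the test point (a sketch; the bucket name and the connection placeholders are assumed):

from influxdb_client import InfluxDBClient

# Sketch: list the stored timestamps for the 'perf' measurement.
with InfluxDBClient(url=myurl, token=mytoken, org=myorg) as _client:
    query = '''
    from(bucket: "mybucket")
      |> range(start: -30d)
      |> filter(fn: (r) => r._measurement == "perf" and r.device == "device1")
    '''
    for table in _client.query_api().query(query):
        for record in table.records:
            print(record.get_time(), record.get_field(), record.get_value())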
Hi @wolfohyeah,
thanks for using our client.
The following data points are the same; only the last one will be saved in InfluxDB:
{
    'measurement': 'perf',
    'tags': {
        'device': 'device1',
        'obj': 'obj1'
    },
    'fields': {
        'ops': 123
    },
    'timestamp': 1690266060
},
{
    'measurement': 'perf',
    'tags': {
        'device': 'device1',
        'obj': 'obj1'
    },
    'fields': {
        'ops': 123
    },
    'timestamp': 1690266160
},
For more info see https://docs.influxdata.com/influxdb/cloud/reference/key-concepts/data-elements/#series
Regards
@bednar Thank you for the reply. But the two points are not the same; you can see the timestamps are different. One is 14:21 and the other is 14:22 (local time).
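(For reference, a minimal sketch of the rule described on the linked data-elements page: two writes only replace each other when the measurement, tag set, field key, and timestamp all match; the field values below are illustrative.)

from influxdb_client import Point, WritePrecision

# Same series key (measurement + tag set) and same timestamp: the second point
# overwrites the first. A different timestamp is stored as a separate point.
p1 = Point("perf").tag("device", "device1").tag("obj", "obj1").field("ops", 123).time(1690266060, WritePrecision.S)
p2 = Point("perf").tag("device", "device1").tag("obj", "obj1").field("ops", 124).time(1690266060, WritePrecision.S)  # overwrites p1
p3 = Point("perf").tag("device", "device1").tag("obj", "obj1").field("ops", 123).time(1690266160, WritePrecision.S)  # separate point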
You are right, sorry for the misunderstanding.
The name of the timestamp field should be 'time', or you can override the name setting like this:
def send_metric(metrics):
    with InfluxDBClient(url=myurl, token=mytoken, org=myorg) as _client:
        with _client.write_api() as _write_client:
            _write_client.write(bucket, org, metrics, write_precision=WritePrecision.S, record_time_key="timestamp")

send_metric(metrics)
For more info see https://influxdb-client.readthedocs.io/en/stable/api.html#writeapi
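For example, with the default settings the first record from this thread would look like this (a sketch; only the key name changes):

metrics = [
    {
        'measurement': 'perf',
        'tags': {'device': 'device1', 'obj': 'obj1'},
        'fields': {'ops': 123},
        'time': 1690266060  # default key name, so no record_time_key override is needed
    }
]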
Regards
Hi @bednar ,
I tried 'time' before I opened this issue.
Step 1. Use 'timestamp' without record_time_key="timestamp": only one point is saved, and its timestamp is changed to the current time.
Step 2. Use 'time' without record_time_key: no data is saved.
Step 3. Following your suggestion, use 'timestamp' with record_time_key="timestamp": no data is saved.
Step 4. Use 'time' with record_time_key="time": no data is saved.
Maybe the problem is in my data structure? My metrics is a list of dicts.
Update: I used Point.from_dict() and it still failed.
from influxdb_client import Point

def make_point_metric(metrics):
    points = []
    for i in metrics:
        # record_time_key matches the key used in the metrics list; I tried both the default 'time' and 'timestamp'
        p = Point.from_dict(i, write_precision=WritePrecision.S, record_time_key='timestamp')
        points.append(p)
    return points

points = make_point_metric(metrics)
send_metric(points)
The result is that no data is saved.
Update again:
_write_client.write(mybucket, myorg, 'perf,device=device1,obj=obj1 ops=123i', write_precision=WritePrecision.S)
Result: without a timestamp specified, the data is saved with the current time.
_write_client.write(mybucket, myorg, 'perf,device=device1,obj=obj1 ops=456i 1690266160', write_precision=WritePrecision.S)
Result: with the timestamp specified, no data is saved.
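One way to surface a server-side rejection, if any, is the synchronous write API, which raises on failure instead of handling it in the background batching thread (a sketch, reusing the placeholder connection variables from above):

from influxdb_client import InfluxDBClient, WritePrecision
from influxdb_client.client.write_api import SYNCHRONOUS
from influxdb_client.rest import ApiException

with InfluxDBClient(url=myurl, token=mytoken, org=myorg) as _client:
    try:
        _client.write_api(write_options=SYNCHRONOUS).write(
            mybucket, myorg,
            'perf,device=device1,obj=obj1 ops=456i 1690266160',
            write_precision=WritePrecision.S)
    except ApiException as e:
        # A rejected write surfaces here instead of failing silently.
        print(f"write rejected: {e.status} {e.body}")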
What query are you using to know that no data was saved? Do you get an error return code?
@bednar can correct me if I am wrong, but I believe write_precision specifies the precision the data is written at; it does not set the precision that your data comes in as.
Regarding the timestamp 1690266160: the timestamp in line protocol is assumed to be a nanosecond value per the line protocol spec. As such, this timestamp is interpreted not as July 25, 2023, but as 1690266160 nanoseconds since Jan 1, 1970. To make matters worse, you are then setting the precision to seconds, which lops off 9 digits, resulting in a timestamp of 1 second.
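For illustration, one way to rule out any precision ambiguity is to convert the epoch seconds to nanoseconds before writing (a sketch; using 'time' as the key assumes the client's default record_time_key):

# Sketch: turn the second-precision epoch values from this thread into
# nanosecond timestamps, which line protocol treats as the default precision.
metrics_ns = []
for m in metrics:
    rec = dict(m)
    rec['time'] = rec.pop('timestamp') * 1_000_000_000  # seconds -> nanoseconds
    metrics_ns.append(rec)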
@powersj you are right, write_precision determines how the data is written, not how it is queried.
@wolfohyeah I've created a working example to demonstrate how you can write and query data using the InfluxDB Python client. This script uses the WritePrecision setting to specify how data timestamps should be handled when writing to the database.
Here's a fully functional example for your use case:
from influxdb_client import WritePrecision, InfluxDBClient
from influxdb_client.client.write_api import SYNCHRONOUS
url = "http://localhost:8086"
token = "my-token"
bucket = "my-bucket"
organization = "my-org"
metrics = [
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device1',
            'obj': 'obj1'
        },
        'fields': {
            'ops': 123
        },
        'timestamp': 1690266060
    },
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device1',
            'obj': 'obj1'
        },
        'fields': {
            'ops': 123
        },
        'timestamp': 1690266160
    },
    {
        'measurement': 'perf',
        'tags': {
            'device': 'device2',
            'obj': 'obj2'
        },
        'fields': {
            'ops2': 456
        },
        'timestamp': 1690266160
    }
]
with InfluxDBClient(url=url, token=token, org=organization, debug=False) as client:
    #
    # Write data
    #
    write_api = client.write_api(write_options=SYNCHRONOUS)
    write_api.write(bucket=bucket, record=metrics, write_precision=WritePrecision.S, record_time_key='timestamp')
    # 1690266060 timestamp means Tuesday, 25 July 2023 06:21:00 in GMT

    #
    # Query data
    #
    query_api = client.query_api()
    result = query_api.query(f'from(bucket:"{bucket}") |> range(start: 2023-01-01T00:00:00Z) |> filter(fn: (r) => r["_measurement"] == "perf")')
    for table in result:
        for record in table.records:
            print(f'{record.values["_time"]}: device={record.values["device"]},obj={record.values["obj"]} '
                  f'{record.values["_field"]}={record.values["_value"]}')
Results:
2023-07-25 06:21:00+00:00: device=device1,obj=obj1 ops=123
2023-07-25 06:22:40+00:00: device=device1,obj=obj1 ops=123
2023-07-25 06:22:40+00:00: device=device2,obj=obj2 ops2=456
Please let me know if this works for you or if there are any other aspects of your data handling that need adjusting.
Best Regards
I'm going to close this, but let us know if there are any outstanding questions.