Time Series Storage Service

Write, store, and query device data in our time series database optimized for IoT data.

The TSDB (time series database) service provides a storage layer dedicated to metric data that is produced and evolves over time. For example, TSDB can be used to store periodic sensor data sent from your devices.

Data is associated with a timestamp and optional tags, which can be used later for query purposes. The TSDB service also provides a robust aggregation system enabling data-mining capabilities.

Operations

Storage

Events

Operations

Storage


deleteAll

Delete all data of a given solution.

Responses

All data deleted

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Delete all data of a given solution
local out = Tsdb.deleteAll()
response.message = out


export

Start an export job.
A TSDB export event will be triggered once the job finishes, in any state.
You can define a 'tsdb' event handler in Lua for this 'exportJob' event in your solution.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
query object Only normal query arguments are allowed; the sampling_size and aggregate arguments will be rejected.
Default values that differ from the query API:
1. If the limit argument is not specified, all resulting data is exported; it is not limited to 10000.
2. If start_time/relative_start is not specified, the export starts from the oldest data.
3. The utc option of epoch is not supported. Timestamps are always returned in numeric format.
format object The data format rules. Each property name should be a field name, which is either a metric name, "timestamp" or "tags".
format.^[a-zA-Z0-9_]+$ [ object ] Functions to format this field. The rules are applied in their order in the array.
format.^[a-zA-Z0-9_]+$[].label string Append the given string to the field value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].round integer {..15} Round the field value to the given precision. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].rename string Rename the field to the given value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].discard boolean Remove the field when the value is true. Only supported for the "tags" field.
format.^[a-zA-Z0-9_]+$[].replace object Replace field values matching the pattern with the new value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].replace.to string The replacement value. Use \{n} to specify a capture group, where {n} is the group number.
format.^[a-zA-Z0-9_]+$[].replace.match string String or regular expression.
format.^[a-zA-Z0-9_]+$[].datetime integer Convert the unix timestamp to a human-readable format (ISO 8601: yyyy-mm-ddThh:mm:ss.[mmm]) using the given value as the UTC offset in hours (n or -n).
For example, the timestamp 1509437405123 will be converted to 2017-10-31T08:10:05.123Z when the value is 0. Timestamp and metrics fields only.
maximum: 14
minimum: -12
format.^[a-zA-Z0-9_]+$[].normalize [ string ] {..20} Normalize the given list of tag names; each will be filled into a separate column. Tags that are not specified will be dropped.
Only supported for the "tags" field.
filename string File name of the exported CSV file. The ".csv" extension is not required in the name. (Spaces are not allowed in the filename.)

Responses

Job successfully started

Content: object The information of job ID

Name Type Description
job_id string Job ID
Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Query constraints for export
local metrics = {
  "temperature",
  "humidity",
  "switch",
  "host"
}
local tags = {
  region = "us",
  city = "minneapolis"
}
local query = {
  metrics = metrics,
  tags = tags
}
local format = {
  temperature = {
    {round=3},
    {rename="Temp"}
  },
  timestamp = {
    {datetime=-5}
  },
  tags = {
    {normalize={"city", "region"}}
  },
  host = {
    {replace={
      match="/(\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}\\.\\d{1,3}):(\\d{1,5})/",
      to="Port: \\2 and IP: \\1"
    }}
  }
}

-- Start a new export job
local job_id = Tsdb.export({
  query = query,
  filename = "export_mlps_20170321",
  format = format
})
response.message = job_id


exportJobInfo

Query the information of an export job, including status.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
job_id ^[a-zA-Z0-9]+$ Job ID

Responses

Job info successfully returned

Content: object The information for export job

Name Type Description
error string Error message if job failed
query object Query arguments
state string State of the job (enqueued, expired, in-progress, completed or failed)
format object The data format rules. Each property name should be a field name, which is either a metric name, "timestamp" or "tags".
format.^[a-zA-Z0-9_]+$ [ object ] Functions to format this field. The rules are applied in their order in the array.
format.^[a-zA-Z0-9_]+$[].label string Append the given string to the field value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].round integer {..15} Round the field value to the given precision. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].rename string Rename the field to the given value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].discard boolean Remove the field when the value is true. Only supported for the "tags" field.
format.^[a-zA-Z0-9_]+$[].replace object Replace field values matching the pattern with the new value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].replace.to string The replacement value. Use \{n} to specify a capture group, where {n} is the group number.
format.^[a-zA-Z0-9_]+$[].replace.match string String or regular expression.
format.^[a-zA-Z0-9_]+$[].datetime integer Convert the unix timestamp to a human-readable format (ISO 8601: yyyy-mm-ddThh:mm:ss.[mmm]) using the given value as the UTC offset in hours (n or -n).
For example, the timestamp 1509437405123 will be converted to 2017-10-31T08:10:05.123Z when the value is 0. Timestamp and metrics fields only.
maximum: 14
minimum: -12
format.^[a-zA-Z0-9_]+$[].normalize [ string ] {..20} Normalize the given list of tag names; each will be filled into a separate column. Tags that are not specified will be dropped.
Only supported for the "tags" field.
job_id string Job ID
length string The total length of the export file in bytes
filename string File name of the exported CSV file
content_id string Content ID of the job in the Content service
context_id string Solution ID
start_time string Start time of the job
update_time string Last updated time of the job
Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

local job_info = Tsdb.exportJobInfo({
  job_id = "xxyyzz"
})
response.message = job_info


exportJobList

List export job records of a given solution in descending timestamp order.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
limit integer Limit the number of results to return (default: 100, maximum allowed: 1000)

Responses

List of export job

Content: [ object ]

Name Type Description
state string State of the job (enqueued, expired, in-progress, completed or failed)
job_id string Job ID
start_time string Start time of the job
Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

local job_list = Tsdb.exportJobList()
response.message = job_list


import

Start an import job with a header row defined in the CSV file.
The header line contains one or more column definitions.
A column is defined as column_name|column_type|data_type.
More details are given below.
Here are the column types used to annotate a column:

  • t means tag
  • m means metric
  • ts means timestamp
  • mn means metric name in a pair
  • mv means metric value in a pair

Here are the data types supported by the different column types:

  • str: tag, metric, metric name, metric value
  • int: metric, metric value
  • sec: timestamp in seconds as integer
  • ms: timestamp in milliseconds as integer
  • us: timestamp in microseconds as integer
  • float: metric, metric value

Combining the two, you can start to write your column definitions in the
first line of the CSV. The formula is:

  header line := column_def_1,column_def_2,column_def_3,....,column_def_n
  column_def := column_name|column_type|data_type

The default data types for some kinds of columns are listed below. For those columns,
the data type can be omitted in the column definition:

  • timestamp: sec
  • tag: str
  • metric name: str

Finally, here is a living example of how to use this kind of annotation to represent
a data set in CSV format. Given the CSV file:

  timestamp|ts,weather|m|str,temperature|m|float,city|t,max_or_min_pair|mn,max_or_min_pair|mv|float
  12345,cold,15.4,Taipei,lowest,12.4
  12344,warm,23.7,Tainan,highest,25.3

It will be transformed into two Tsdb.write calls:

  Tsdb.write({
    metrics = {
      weather = "cold",
      temperature = 15.4,
      lowest = 12.4
    },
    tags = {
      city = "Taipei"
    },
    ts = "12345000000"
  })
  Tsdb.write({
    metrics = {
      weather = "warm",
      temperature = 23.7,
      highest = 25.3
    },
    tags = {
      city = "Tainan"
    },
    ts = "12344000000"
  })

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
url string The url for a CSV file

Responses

Job successfully started

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

local job_id = Tsdb.import({
  url = "http://example.com/a_sample_file"
})
response.message = job_id


importJobInfo

Query the status of an import job.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
job_id ^[a-zA-Z0-9]+$ Job ID

Responses

Job info successfully returned

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

local job_info = Tsdb.importJobInfo({
  job_id = "2345"
})
response.message = job_info


importJobList

List all job IDs started by a given solution ID.

Responses

Job successfully listed

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

local job_list = Tsdb.importJobList()
response.message = job_list


listMetrics

List metrics of a given solution.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
limit integer {..1000} Limit the number of rows to return.
Default: 1000
next string Optional; the cursor to get the next page if there is more data

Responses

Metrics information retrieved

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Get a list of created metrics for a given solution
local out = Tsdb.listMetrics({limit = 10})
response.message = out

-- Use next cursor to fetch next page if found in the result of previous query
local out = Tsdb.listMetrics({limit = 10, next = "switch_2"})
response.message = out


listTags

List tags of a given solution.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
limit integer {..1000} Limit the number of rows to return.
Default: 1000
next string Optional; the cursor to get the next page if there is more data

Responses

Tags information retrieved

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Get a list of created tags for a given solution
local out = Tsdb.listTags({limit = 10})
response.message = out

-- Use next cursor to fetch next page if found in the result of previous query
local out = Tsdb.listTags({limit = 10, next = "city_taipei_1_1_true"})
response.message = out


multiWrite

Write data points to one or many metrics with an optional set of tags and a timestamp down to microsecond precision.

Note that if multiple data points are written with exactly the same timestamp, only the last one will be kept and it overwrites the others.

Each metric value has a size limit that depends on the number of tags: (number of tags + 1) multiplied by the size of the metric value cannot exceed 480KB. A write request exceeding this limit will be rejected without any partial writes. If any of the data points are invalid, the request will also be rejected without partial writes.

To prevent a synchronous service call from taking too long to respond, there are some limitations. The total number of data entries in a multiple write is at most 2,000 (refer to the limits of a single write). The number of data entries per data point is: "Number of metrics * (Number of tag pairs) + 1".

If it succeeds, it returns a list of write timestamp strings in microseconds.
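The 480KB constraint is simple arithmetic, sketched below for clarity. fits_size_limit is a hypothetical helper for illustration only, not part of the Tsdb API:

```lua
-- Illustration of the documented size limit:
-- (number of tags + 1) * size of a metric value must not exceed 480KB.
-- "fits_size_limit" is a hypothetical helper, not part of the Tsdb API.
local LIMIT_BYTES = 480 * 1024

local function fits_size_limit(num_tags, metric_value_bytes)
  return (num_tags + 1) * metric_value_bytes <= LIMIT_BYTES
end

-- With the maximum of 20 tags, a metric value can be at most ~22.8KB:
print(fits_size_limit(20, 22 * 1024))  -- true
print(fits_size_limit(20, 23 * 1024))  -- false
```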

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
return_ts boolean Whether to return write timestamp in the response
datapoints [ object ] List of data points
datapoints[].ts integer, string Unix timestamp in microseconds used as the write time for the given data point.
Supported units: u (microseconds), ms (milliseconds), s (seconds)
e.g. 1472547546000000u, 1472547546000ms, 1472547546s, 1472547546
Optional; if not provided, the time the server received the request, in microseconds, is used
datapoints[].tags object Pairs of tag and its tag value (only text supported).
Maximum number of tags in a single write: 20
datapoints[].metrics object Pairs of metric name and its value.
Maximum number of metrics in a single write: 100

Responses

Data successfully inserted

Content: [ object ] List of result of each data point.

Name Type Description
write_timestamp string The timestamp of data point written to TSDB (in microseconds)
Data successfully inserted

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Write multiple datapoints of metrics with tags
-- If a timestamp is not provided, the time the server received the request, in microseconds, is used
local metrics1 = {
  temperature = 37.2,
  humidity = 73,
  switch = "on"
}
local metrics2 = {
  temperature = 31.2,
  humidity = 55,
  switch = "off"
}
local tags1 = {
  pid = "pzomp8vn4twklnmi",
  identity = "000001",
  region = "us",
  city = "minneapolis"
}
local tags2 = {
  pid = "lvwpoj19hp7k0000",
  identity = "000002",
  region = "tw",
  city = "taipei"
}
local out = Tsdb.multiWrite({
  datapoints = {
    {
      metrics = metrics1,
      tags = tags1
    },
    {
      metrics = metrics2,
      tags = tags2
    }
  },
  return_ts = true
})
response.message = out


query

Query data points using any metrics and tags. Both absolute (start_time, end_time) and relative (relative_start, relative_end) time parameters are supported, but they cannot be used at the same time.
The first element in a returned data point array is always the timestamp.
You can use the fill argument to control the imputation of missing values.
The metric names in the columns property of the response will always be in the order specified in the query, except for the timestamp column, which is always first.

If no time constraints are specified, it will return recent data points from the last week, up to the maximum limit.

Note that only uniquely timestamped data will be returned: if multiple data points were written with exactly the same timestamp, only the last one will be kept in the response. The logical OR operator is only supported in basic queries and cannot be used in downsampling or aggregation queries.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
fill integer, string Value to fill time slots where no data points exist.
For queries without sampling, it only works in merge mode.
Supported fill types:
- "": fill the slot with the empty string "".
- "null": fill the slot with JSON null.
- "previous": fill the slot with the previous value.
- "none": fill the slot with the string "none".
- any integer: fill the slot with the specified integer value.
- "~s:CUSTOM_STRING": fill the slot with the specified "CUSTOM_STRING" string.
Default: none
mode string Indicate whether to merge or split the result of each metric.
Supported options: merge, split
Default: merge
tags object One or many tags.
Maximum number of tag pairs: 20
If a tag value is a string, it is combined with the other tag pairs using the "AND" operator.
The following is an example of the operator (type=sensor and area=US):
{
"tags": {
"type": "sensor",
"area": "US"
}
}
If a tag value is an array, the "OR" operator is applied. All tag values with array type are in the same "OR" operator group, even if the tag names differ. When the OR operator appears in a query, the response structure will be grouped by the tags of the OR operator.
Maximum number of "OR" tag values: 100
The following is an example of the operator (type=switch or type=sensor or area=US or area=TW); it counts as 4 "OR" tag values:
{
"tags": {
"type": ["switch", "sensor"],
"area": ["US", "TW"]
}
}
epoch string Change returned timestamp of data points to unix epoch format.
Supported units: u (microseconds), ms (milliseconds), s (seconds)
Optional, if not provided, timestamps are returned in RFC3339 UTC with microsecond precision. (Note: the time offset notation can be 'Z' or '+00:00')
limit integer {..10000} Limit the number of data points to return per metric (default is 1000).
NOTE: When querying data points with the OR operator, the limit (default and maximum) depends on the number of OR tags provided; that is, default = original default / number of OR tags.
metrics [ string ] One or many metrics
end_time integer, string Exclusive UTC end of the time range to query; an RFC3339 UTC string is also accepted.
Supported units: u (microseconds), ms (milliseconds), s (seconds)
e.g. 1472547546000000u, 1472547546000ms, 1472547546s, 1472547546, 2016-08-30T08:59:06Z
Optional, if not provided, it will use current timestamp in microseconds from server side
order_by string Return results in ascending or descending time order.
Supported options: desc, asc
Default: "desc"
aggregate [ string ] One or many aggregation functions to apply.
Supported functions: avg, min, max, count, sum
String-type values only support the count function.
start_time integer, string Inclusive UTC start of the time range to query; an RFC3339 UTC string is also accepted.
Supported units: u (microseconds), ms (milliseconds), s (seconds)
e.g. 1472547546000000u, 1472547546000ms, 1472547546s, 1472547546, 2016-08-30T08:59:06Z, 2016-08-30T08:59:06+00:00
Optional, if not provided, it will be 7 days earlier than end_time.
relative_end integer, string A negative integer with time unit to indicate relative end time before now.
Supported units: u (microseconds), ms (milliseconds), s (seconds), m (minutes), h (hours), d (days), w (weeks)
sampling_size string The size of the time slots used for downsampling. Must be a positive integer.
Supported units: u (microseconds), ms (milliseconds), s (seconds), m (minutes), h (hours), d (days), w (weeks)
Optional, used together with fill arguments.
relative_start integer, string A negative integer with time unit to indicate relative start time before now.
Supported units: u (microseconds), ms (milliseconds), s (seconds), m (minutes), h (hours), d (days), w (weeks)
Default: -7d (last 7 days)

Responses

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message
Query results retrieved in merge mode.

Content: object Response data in merge mode.

Name Type Description
tags object The “AND” operator tags specified in query arguments.
values [ [ number, string, object ] ] The data points list.
columns [ string ] The column names for mapping the column in values property.
metrics [ string ] Metrics which are specified in query arguments.
Query results retrieved in split mode.

Content: object Response data in split mode.

Name Type Description
tags object The “AND” operator tags specified in query arguments.
values object The data points list for each metric, the property name is the metric name specified in query arguments.
values.^[a-zA-Z0-9_]+$ [ [ number, string, object ] ] The data points list.
columns [ string ] It's always an empty array in this mode.
metrics [ string ] Metrics which are specified in query arguments.
Query results retrieved in merge mode with OR operation.

Content: object Response data in merge mode with OR operation.

Name Type Description
tags object The “AND” operator tags specified in query arguments.
values object The data points list of specific tag name, the property name is tag name.
values.^[a-zA-Z0-9_]+$ object The data points list of specific tag name and tag value, the property name is tag value.
values.^[a-zA-Z0-9_]+$.^[a-zA-Z0-9_]+$ [ [ number, string, object ] ] The data points list.
columns [ string ] The column names for mapping the column in values property.
metrics [ string ] Metrics which are specified in query arguments.
Query results retrieved in split mode with OR operation.

Content: object Response data in split mode with OR operation.

Name Type Description
tags object The “AND” operator tags specified in query arguments.
values object The data points list of specific tag name, the property name is tag name.
values.^[a-zA-Z0-9_]+$ object The data points list of specific tag name and tag value, the property name is tag value.
values.^[a-zA-Z0-9_]+$.^[a-zA-Z0-9_]+$ object The data points list for each metric, the property name is the metric name specified in query arguments.
values.^[a-zA-Z0-9_]+$.^[a-zA-Z0-9_]+$.^[a-zA-Z0-9_]+$ [ [ number, string, object ] ] The data points list.
columns [ string ] It's always an empty array in this mode.
metrics [ string ] Metrics which are specified in query arguments.
Query results retrieved in aggregation mode.

Content: object Response data when aggregation functions are specified in query arguments.

Name Type Description
tags object The “AND” operator tags specified in query arguments.
values object The data points list for each metric, the property name is the metric name specified in query arguments.
values.^[a-zA-Z0-9_]+$ object The aggregation of data points.
values.^[a-zA-Z0-9_]+$.avg number The average of metric values.
values.^[a-zA-Z0-9_]+$.max number The maximum of metric values.
values.^[a-zA-Z0-9_]+$.min number The minimum of metric values.
values.^[a-zA-Z0-9_]+$.sum number The sum of metric values.
values.^[a-zA-Z0-9_]+$.count number The count of metric values.
columns [ string ] It's always an empty array in this mode.
metrics [ string ] Metrics which are specified in query arguments.

Example

-- Query by Absolute Time Constraint --
-- Get temperature and switch data points between 2016-08-01 (inclusive) and 2016-09-01 (exclusive) from devices in Minneapolis city and US region
local metrics = {
  "temperature",
  "switch"
}
local tags = {
  region = "us",
  city = "minneapolis"
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  start_time = "2016-08-01T00:00:00Z",
  end_time = "2016-09-01T00:00:00Z",
  fill = "null",
  limit = 50
})
response.message = out

-- Query by Relative Time Constraint --
-- Get temperature data points from the recent 3 days from devices in Taipei city and Asia region (with timestamps in milliseconds format)
local metrics = {"temperature"}
local tags = {
  region = "asia",
  city = "taipei"
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  relative_start = "-3d",
  epoch = "ms",
  fill = "null",
  limit = 50
})
response.message = out

-- Query without Time Constraint --
-- Get most recent 5 temperature and humidity data points from devices in Minneapolis city and US region
local metrics = {
  "temperature",
  "humidity"
}
local tags = {
  region = "us",
  city = "minneapolis"
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  limit = 5
})
response.message = out

-- Query by Downsampling --
-- Get humidity data points in recent two days from devices in Taipei city, downsampled by 4-hours time slots
local metrics = {
  "humidity"
}
local tags = {
  city = "taipei"
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  relative_start = "-2d",
  sampling_size = "4h",
  fill = "none",
  epoch = "ms"
})
response.message = out

-- Aggregation by Downsampling --
-- Get average and count of tire pressure data between 2016-08-01 (inclusive) and 2016-09-01 (exclusive) from devices in Minneapolis city, downsampled by 30-minute time slots
local metrics = {
  "tire_pressure"
}
local tags = {
  city = "minneapolis"
}
local aggregate = {"avg", "count"}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  start_time = "2016-08-01T00:00:00Z",
  end_time = "2016-09-01T00:00:00Z",
  aggregate = aggregate,
  sampling_size = "30m",
  fill = "none"
})
response.message = out

-- Query by Fill in Custom String --
-- Fill the empty time slots with the string "Empty"
local metrics = {
  "temperature",
  "switch"
}
local tags = {
  region = "us",
  city = "minneapolis"
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  fill = "~s:Empty",
})
response.message = out

-- Query by OR tags operator --
-- Get temperature data points which belong to (sn = dev1 or sn = dev2) and dev = pump
local metrics = {
  "temperature"
}
local tags = {
  dev = "pump",
  sn = {"dev1", "dev2"}
}
local out = Tsdb.query({
  metrics = metrics,
  tags = tags,
  start_time = "2016-08-01T00:00:00Z",
  end_time = "2016-09-01T00:00:00Z",
  limit = 50
})
response.message = out


recent

Get the most recent data point of a particular set of metrics and tag values.

If you want to use an advanced metric query, specify an inner Lua table as an element
inside the Lua table of the metrics parameter, i.e. metrics = {"m1","m2",{"m3","m4"}},
where {"m3","m4"} is an advanced metric query that queries the most recent data
of m3 and also reports the value of m4 that was written together with m3.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
metrics [ object ] One or many metrics
tag_name string Tag name
tag_values [ integer, string ] One or many tag values

Responses

Operation successfully returned

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Get latest data of metric vibration and humidity from devices with tag sn=123 or sn=456
local out = Tsdb.recent({
  metrics = {"vibration","humidity"},
  tag_name = "sn",
  tag_values = {"123", "456"}
})
response.message = out

-- Get latest data of metric warning and critical from devices with tag sn=123 or sn=456
-- Together with the corresponding text for warning and critical metrics
local out = Tsdb.recent({
  metrics = {{"warning", "text"}, {"critical", "text"}},
  tag_name = "sn",
  tag_values = {"123", "456"}
})
response.message = out


write

Write data point to one or many metrics with an optional set of tags and a timestamp down to microsecond precision.

Note that if multiple data points are written with exactly the same timestamp, only the last one will be kept and it overwrites the others.

Each metric value has a size limit that depends on the number of tags: (number of tags + 1) multiplied by the size of the metric value cannot exceed 480KB. A write request exceeding this limit will be rejected without any partial writes.

If it succeeds, it returns a JSON object containing the write timestamp in microseconds.

Arguments

parameters - object - Object containing service call parameters.
Name Type Description
ts integer, string Unix timestamp in microseconds used as the write time for the given data point.
Supported units: u (microseconds), ms (milliseconds), s (seconds)
e.g. 1472547546000000u, 1472547546000ms, 1472547546s, 1472547546
If the unit is not provided, it is inferred from the number of digits, to prevent the user from unintentionally using unusual UTC timestamps.
For second precision, a Unix timestamp is always 10 digits long between 2001/09/09 (before which it is 9 digits) and 2286/11/20 (after which it is 11 digits); we assume "real data" being written falls in this 10-digit range.
Similarly, for milliseconds (seconds times 10^3) and microseconds (seconds times 10^6), the expected numbers of digits are 13 and 16 respectively.
For example, 1472547546 is interpreted as seconds, while 14725475460 is invalid.
Valid time range: 1,000,000,000,000,000(us) to 9,999,999,999,999,999(us) unix timestamp.
Optional; if not provided, the time the server received the request, in microseconds, is used
tags object Pairs of tag and its tag value (only text supported).
Maximum size of tag name and tag value: 1KB.
Maximum number of tags in a single write: 20
metrics object Pairs of metric name and its value.
Maximum size of metric name: 1KB.
Maximum number of metrics in a single write: 100
return_ts boolean Whether to return write timestamp in the response
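The digit-count rule for the ts argument can be sketched as follows. infer_unit is a hypothetical helper for illustration only, not part of the Tsdb API:

```lua
-- Sketch of the digit-count heuristic for "ts" values given without a
-- unit suffix: 10 digits -> seconds, 13 -> milliseconds,
-- 16 -> microseconds; any other length is rejected.
-- "infer_unit" is a hypothetical helper, not part of the Tsdb API.
local function infer_unit(ts)
  -- format numbers without an exponent so large values keep all digits
  local s = type(ts) == "string" and ts or string.format("%.0f", ts)
  local digits = #s
  if digits == 10 then
    return "s"
  elseif digits == 13 then
    return "ms"
  elseif digits == 16 then
    return "u"
  else
    return nil, "invalid timestamp length"
  end
end

print(infer_unit(1472547546))          -- s   (10 digits)
print(infer_unit("1472547546000"))     -- ms  (13 digits)
print(infer_unit("1472547546000000"))  -- u   (16 digits)
```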

Responses

Data successfully inserted

Content: object The response of write

Name Type Description
write_timestamp string The timestamp of data point written to TSDB (in microseconds)
Data successfully inserted

Content: nil

Error

Content: object The response to the caller

Name Type Description
error string Error Message in case of failure
result object Result message

Example

-- Write data point of metrics with tags
-- If a timestamp is not provided, the time the server received the request, in microseconds, is used
local metrics = {
  temperature = 37.2,
  humidity = 73,
  switch = "on",
  host = "8.168.1.24:443"
}
local tags = {
  pid = "pzomp8vn4twklnmi",
  identity = "000001",
  region = "us",
  city = "minneapolis"
}
local out = Tsdb.write({
  metrics = metrics,
  tags = tags
})
response.message = out

-- Write data points of metrics with tags and timestamp
local metrics = {
  temperature = 37.2,
  humidity = 73
}
local tags = {
  identity = "000002"
}
local out = Tsdb.write({
  metrics = metrics,
  tags = tags,
  ts = "1476243965s"
})
response.message = out

Events


exportJob

An event message containing the export task result.

Arguments

job - object - The information for export job
Name Type Description
error string Error message if job failed
query object Query arguments
state string State of the job (enqueued, expired, in-progress, completed or failed)
format object The data format rules. Each property name should be a field name, which is either a metric name, "timestamp" or "tags".
format.^[a-zA-Z0-9_]+$ [ object ] Functions to format this field. The rules are applied in their order in the array.
format.^[a-zA-Z0-9_]+$[].label string Append the given string to the field value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].round integer {..15} Round the field value to the given precision. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].rename string Rename the field to the given value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].discard boolean Remove the field when the value is true. Only supported for the "tags" field.
format.^[a-zA-Z0-9_]+$[].replace object Replace field values matching the pattern with the new value. Metrics fields only.
format.^[a-zA-Z0-9_]+$[].replace.to string The replacement value. Use \{n} to specify a capture group, where {n} is the group number.
format.^[a-zA-Z0-9_]+$[].replace.match string String or regular expression.
format.^[a-zA-Z0-9_]+$[].datetime integer Convert the unix timestamp to a human-readable format (ISO 8601: yyyy-mm-ddThh:mm:ss.[mmm]) using the given value as the UTC offset in hours (n or -n).
For example, the timestamp 1509437405123 will be converted to 2017-10-31T08:10:05.123Z when the value is 0. Timestamp and metrics fields only.
maximum: 14
minimum: -12
format.^[a-zA-Z0-9_]+$[].normalize [ string ] {..20} Normalize the given list of tag names; each will be filled into a separate column. Tags that are not specified will be dropped.
Only supported for the "tags" field.
job_id string Job ID
length string The total length of the export file in bytes
filename string File name of the exported CSV file
content_id string Content ID of the job in the Content service
context_id string Solution ID
start_time string Start time of the job
update_time string Last updated time of the job

Example

function handle_tsdb_exportJob (job)
  -- Your logic comes here
end