Accepted answer

You don't actually need to pass a datetime for each data point in your series. You only need to give the chart the start datetime of the series via pointStart, so it knows when to start counting, like so:

series: [{
    name: 'Wind Speed',
    // y-values only, no timestamps
    data: [0.52, 0.778, 0.746, 0.594, 0.716, 0.793, 0.648, 0.828, 0.202, 0.066, 0.116, 0.116, 0.17, 0.195, 0.051, 0, 0.368, 2.365, 2.841, 2.693, 2.416, 2.541, 2.429, 2.888],
    pointStart: 1360893600000, // start time, in milliseconds since the Unix epoch
    pointInterval: 3600000     // spacing between points: one hour in milliseconds
}]

See it in action at: http://jsfiddle.net/Reality_Extractor/pNFYL/

pointStart expects Unix time in milliseconds, and there are many ways to come up with the particular time you need; I just hardcoded it in the example for demonstration purposes.
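For instance, instead of hardcoding the number you can let JavaScript compute it with Date.UTC, which returns exactly the millisecond value pointStart expects. A minimal sketch, using the same date as the example above:

// Date.UTC returns milliseconds since the Unix epoch.
// Note that months are zero-based, so 1 means February.
var start = Date.UTC(2013, 1, 15, 2); // 15 Feb 2013 02:00 UTC === 1360893600000

series: [{
    name: 'Wind Speed',
    data: [/* ... */],
    pointStart: start,
    pointInterval: 3600000 // one hour in milliseconds
}]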

When you use pointStart, pointInterval is necessary to accurately assign a time to each data point. It's worth noting that this only works for data with a regular interval; if you have irregular data, you do need to supply a datetime for each data point.
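In the irregular case, each point becomes an [x, y] pair, where x is the timestamp in milliseconds. A sketch; the timestamps here are made up for illustration:

series: [{
    name: 'Wind Speed',
    // irregular data: each point carries its own timestamp
    data: [
        [Date.UTC(2013, 1, 15, 2), 0.52],
        [Date.UTC(2013, 1, 15, 2, 40), 0.778], // 40 minutes later
        [Date.UTC(2013, 1, 15, 4, 10), 0.746]  // 90 minutes after that
    ]
}]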

This doesn't quite answer your question about passing the times as an array and then assigning it to the xAxis, but I believe the pointStart approach is less complex than array assignments.

Also, somewhat related: depending on how often your data updates, you should cache it on the server, and on the client as well, to prevent unnecessary refreshes over the network when the data hasn't actually changed.

