Datafile Source - Datetime is in UTC
-
I'm receiving "Illegal instant due to time zone offset transition" on my file imports, which I can see is caused by the transition to/from daylight saving time. My values are in UTC but I'm not sure how to set that in the import. My code is below. As you can see, I tried simply skipping the lines (since the failure is at 2am), but then it throws an error with the NumericImportPoint because time isn't set. How can I set the timezone to UTC?
```java
import java.util.HashMap;
import java.util.Map;
import org.joda.time.format.DateTimeFormat;
import org.joda.time.format.DateTimeFormatter;
import org.joda.time.IllegalInstantException;
import com.infiniteautomation.datafilesource.contexts.AbstractCSVDataSource;
import com.infiniteautomation.datafilesource.dataimage.NumericImportPoint;

/**
 * Importer that will import data on a point per column basis
 * The first column is the Date
 *
 * @author Terry Packer
 */
public class pxm4000_energyLog extends AbstractCSVDataSource {

    private DateTimeFormatter dtf = DateTimeFormat.forPattern("MM/dd/yyyy HH:mm");
    //private DateTimeFormatter dtf = DateTimeFormat.forPattern("MM/dd/yyyy");
    private Map<Integer, String> headerMap = new HashMap<Integer, String>();

    @Override
    public void importRow(String[] row, int rowNum) {
        //Strip out the header row, it does not contain our data
        if(rowNum == 0){
            for(int k = 0; k < row.length; ++k) {
                this.headerMap.put(k, row[k]);
            }
        }else{
            //Column 0 is the time
            //try {
            long time = dtf.parseDateTime(row[0] + ' ' + row[1]).getMillis();
            //} catch(IllegalInstantException e) {
            //    return; //Don't let the exception escape
            //}

            //Empty additional parameters
            Map<String, String> extraParams = new HashMap<String, String>();

            //For each additional column we will create an Import Point
            for(int i = 2; i < row.length; i++){
                String identifier = headerMap.get(i); //Get the identifier from our header map
                double value = Double.parseDouble(row[i]); //Create the value
                NumericImportPoint point = new NumericImportPoint(identifier, value, time, extraParams);
                this.parsedPoints.add(point);
            }
        }
    }
}
```
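For what it's worth, a minimal standalone sketch (using the JDK's java.time rather than Joda-Time, and an assumed spring-forward date in an assumed zone) shows why that wall-clock time can't be parsed in a DST zone: 02:00 on the transition day simply never occurs.

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.zone.ZoneOffsetTransition;

public class DstGapDemo {
    public static void main(String[] args) {
        ZoneId zone = ZoneId.of("America/New_York"); // assumed zone
        // 2015-03-08 02:00 falls inside the spring-forward gap in this zone
        LocalDateTime local = LocalDateTime.of(2015, 3, 8, 2, 0);

        // A non-null gap transition means this wall-clock time never existed,
        // which is exactly what Joda's IllegalInstantException complains about
        ZoneOffsetTransition t = zone.getRules().getTransition(local);
        System.out.println(t != null && t.isGap()); // true

        // One hour later (after the jump) there is no transition
        System.out.println(zone.getRules().getTransition(local.plusHours(1))); // null
    }
}
```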
-
Get the millis after you do your filtering. Do this:
```java
String ts = row[0] + ' ' + row[1]; //timestring
long dt;
if(ts.matches("^09/29/2019 02:[0-5][0-9]")
        || ts.matches("^09/30/2018 02:[0-5][0-9]")
        || ts.matches("^09/25/2016 02:[0-5][0-9]")
        || ts.matches("^09/24/2017 02:[0-5][0-9]")
        || ts.matches("^09/29/2013 02:[0-5][0-9]")
        || ts.matches("^09/28/2014 02:[0-5][0-9]")
        || ts.matches("^09/27/2015 02:[0-5][0-9]")
        || ts.matches("^09/26/2010 02:[0-5][0-9]")
        || ts.matches("^09/25/2011 02:[0-5][0-9]")
        || ts.matches("^09/30/2012 02:[0-5][0-9]")) {
    return;
}
dt = dtf.parseDateTime(ts).getMillis();
```
I'm afraid I don't know of another way around it; at least it hasn't come to me yet, since when the time change happens the hour is lost for the 2AM period, meaning you'd otherwise be doubling up with timestamps going over one another. You'd need to amend my code, as I'm not sure which month it applies to for you. I'm in the southern hemisphere; since you're posting about this now, I'll wager you're in the north and the month should be March. I'll do some further googling..
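If you'd rather not hardcode each year's changeover date, one possible alternative is to ask the zone itself whether a wall-clock time falls inside the spring-forward gap and skip those rows. This is only a sketch using the JDK's java.time (not Joda-Time, which the importer above uses), and the zone and helper name are my own assumptions:

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.format.DateTimeFormatter;

public class GapFilter {
    private static final ZoneId ZONE = ZoneId.of("America/New_York"); // assumed zone
    private static final DateTimeFormatter FMT =
            DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm");

    // True when the wall-clock time never existed in ZONE (spring-forward gap),
    // so the row can be skipped without pattern-matching specific dates
    static boolean inDstGap(String ts) {
        LocalDateTime local = LocalDateTime.parse(ts, FMT);
        return ZONE.getRules().getValidOffsets(local).isEmpty();
    }

    public static void main(String[] args) {
        System.out.println(inDstGap("03/08/2015 02:30")); // true: skip this row
        System.out.println(inDstGap("03/08/2015 03:30")); // false: safe to parse
    }
}
```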
-
@mattfox said in Datafile Source - Datetime is in UTC:
02:[0-5][0-9]"
Here's the error I get when I inserted that code:

```
Event from import class: org.joda.time.IllegalInstantException: Cannot parse "03/08/2015 02:00": Illegal instant due to time zone offset transition (America/New_York)
```

I simplified it down to the following to be sure I wasn't missing something obvious. Any thoughts?
```java
String ts = row[0] + ' ' + row[1]; //timestring
long dt;
if(ts.matches("^03/08/2015 02:[0-5][0-9]")) {
    return;
}
dt = dtf.parseDateTime(ts).getMillis();
```
-
Let's print your timestamp to make sure I've got the format right:

```java
System.out.print("Timestring: " + ts + "\n");
```
-
To use a specific time zone:

```java
import org.joda.time.DateTimeZone;

DateTimeFormatter dateTimeFormatter = DateTimeFormat.forPattern("MM/dd/yyyy HH:mm").withZone(DateTimeZone.UTC);
```
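For anyone landing here on a newer stack: the same idea expressed with the JDK's java.time (a sketch, not part of the original answer) parses the previously illegal timestamp cleanly, because pinning the formatter to UTC means DST transitions never apply:

```java
import java.time.ZonedDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class UtcParseDemo {
    public static void main(String[] args) {
        // Pin the formatter to UTC so no zone offset transitions exist
        DateTimeFormatter fmt = DateTimeFormatter.ofPattern("MM/dd/yyyy HH:mm")
                .withZone(ZoneOffset.UTC);

        // This wall-clock time is illegal in America/New_York but fine in UTC
        long millis = ZonedDateTime.parse("03/08/2015 02:00", fmt)
                .toInstant().toEpochMilli();
        System.out.println(millis); // 1425780000000
    }
}
```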
-
Thanks Terry, I was busy googling this.
-
@terrypacker said in Datafile Source - Datetime is in UTC:
.withZone(DateTimeZone.UTC);
Terry Packer for the win! Thanks for the amazing help.