Multiplicand and Augend on HTTP Receiver
-
We are implementing a transparent BLE-to-HTTP Receiver solution using Raspberry Pis, so each Pi relays every Bluetooth value it receives. The devices transmit a mix of water, beer, electricity, etc. pulses. However, some flow meters produce 5 pulses per litre, some 10 pulses per kWh, and so on. It would become a complete mess if we had to configure the "real world translation" in the field on the Pi or on the Bluetooth device, which is a $10 part.
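To be concrete, the "real world translation" we need is just a linear scale per reading; a minimal Python sketch (the function name and the factors below are made-up examples):

# Purely illustrative: raw pulse counts -> engineering units
def to_engineering_units(pulses, multiplicand, augend=0.0):
    return pulses * multiplicand + augend

to_engineering_units(5, 0.2)     # 5-pulses-per-litre water meter  -> 1.0 litre
to_engineering_units(10, 0.1)    # 10-pulses-per-kWh energy meter  -> 1.0 kWh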
So we would like to configure Multiplicand and Augend (optional) on the back-end.
Is there a way to do this without having to create 2 datapoints (point link or virtual data point) for each sensor?
SNMP works well, but many Raspberry Pis are behind networks that only allow HTTP traffic, so we have to use the HTTP Receiver.
-
Hi glamprecht,
Is there a way to do this without having to create 2 datapoints (point link or virtual data point) for each sensor?
If one makes one's points HTTP receiver points, then as it currently stands, no. We have talked about adding multipliers and augends to all points, but it hasn't seemed really necessary on many data source types yet, so there's been no push to do them all.
I would consider using an HTTP receiver to intake the data (one could also use a virtual serial server socket and a serial data source), but capturing it all into the same point. Then I would have a scripting data source (but one could use a point link or meta point, too) triggered by that point, which parses and transforms the incoming message (and you could have it create any points that were needed, too).
Do you have control over how it formats the HTTP message? Would you like a skeleton of such a script?
-
Yes, I have control over the HTTP messages, so I can add anything to them. I would appreciate help with the script. Thank you
-
@phildunlap said in Multiplicand and Augend on HTTP Receiver:
(one could also use a virtual serial server socket and a serial data source)
That's how I'd do it: use socat to parse and send the data, and then a regex on the data source end.
The beauty of socat is that you can even make it read from script outputs on the Pi itself and send the modified data to a point in the Mango system. But that's just my take on how I'd approach it (keep all the nitty-gritty on the Pi). Also note that if you need to modify values, it's best to keep it consistent, either at the source or in Mango itself. That will remove any uncertainty about what's being fed in, should you be sending data from another location/source down the line.
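For instance, a rough sketch of that idea (this is a plain-Python stand-in for the socat pipe; the script path, host and port are placeholders): read already-scaled readings from a scanner script on the Pi and stream them into the TCP port behind a Mango virtual serial server socket, letting the serial data source's regex pick each line apart.

import socket
import subprocess

MANGO_HOST, MANGO_PORT = "mango.example.com", 9000    # placeholder serial-socket endpoint

# One already-scaled reading per line from the scanner script (placeholder path)
scanner = subprocess.Popen(["python3", "/home/pi/ble_scan_and_scale.py"],
                           stdout=subprocess.PIPE)
with socket.create_connection((MANGO_HOST, MANGO_PORT)) as sock:
    for line in scanner.stdout:
        sock.sendall(line)    # the serial data source's regex parses each line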
Good luck
Fox
-
I have, for example, 15 BLE devices asynchronously transmitting their values through BLE advertisements (without being connected) every 5 seconds. I have a Python scanner that gets each value and spawns off a thread to post it, along with the MAC, to Mango.
If I need to add a new sensor in Mango, I just enter the MAC in the data point and bring the BLE device within 100 m of any Pi, and it works. I can also build in redundancy by having multiple Pis within reach of each other without any extra coding. I don't even have to add/register the Pi in Mango; it is completely transparent. And all I need to do to add another node is copy the Pi's SD card and put it in another Pi. The elegance of the system is great, except that the real-world values are not so elegant at the moment.
Won't using a virtual serial port limit this connection to one stream, i.e. one Pi?
-
Won't using a virtual serial port limit this connection to one stream, i.e. one Pi?
Yes, good suspicion!
Yes, I have control over the HTTP messages, so I can add anything to them.
Great! So I would set up an HTTP receiver with only one point. Let's say its parameter name is 'data' (or maybe just 'd', for short). Then I'll publish messages to the receiver like:
POST / HTTP/1.1
User-Agent: Mango M2M2 HTTP Sender publisher
Content-Length: 31
Content-Type: application/x-www-form-urlencoded
Host: 127.0.0.1
Connection: Keep-Alive
Expect: 100-continue
Accept-Encoding: gzip,deflate

data=s1z56.78+s2z78.90+s3z12.34
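(On the Pi side, the Python scanner could assemble and send exactly that body with something like the sketch below; the receiver URL and the readings are placeholders, and the requests library is just one way to do the POST. Note that the form encoding turns the spaces in the payload into the '+' signs shown above.)

import requests

RECEIVER_URL = "http://127.0.0.1:8080/httpds"    # placeholder: whatever URL your HTTP receiver listens on

# Readings gathered from the BLE advertisements, keyed by short sensor identifiers
readings = {"s1": "56.78", "s2": "78.90", "s3": "12.34"}

payload = " ".join("%sz%s" % (sensor, value) for sensor, value in readings.items())
# Sent as data=s1z56.78+s2z78.90+s3z12.34 (x-www-form-urlencoded encodes spaces as '+')
requests.post(RECEIVER_URL, data={"data": payload})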
And put the HTTP receiver point with parameter name "data" into the context of a script like:
var sensorTransforms = this.sensorTransforms;
if(typeof this.sensorTransforms === 'undefined') {
    this.sensorTransforms = sensorTransforms = {
        //Use this map of sensor identifiers to transformation functions to
        // manipulate the value of the data
        s1: function(val) { return val*2 + 3; },
        s2: function(val) { return val*6 - 12; },
        s3: function(val) { return Math.pow(val, 2); }
    };
}

function createPoint(identifier) {
    var newDp = JSON.parse(JsonEmport.dataPointQuery(
        'eq(xid,DP_BaseQueueProcessorNumeric)')).dataPoints[0];
    newDp.enabled = true;
    newDp.pointLocator.varName = identifier;
    newDp.name = "Sensor " + identifier;
    delete newDp.xid; //To generate a new XID,
    //or encode your own XID for reference elsewhere
    JsonEmport.doImport( JSON.stringify({"dataPoints": [newDp]}) );
}

//Because there is a slight asynchrony, we need to track where we got to in the data
// queue, so we'll use a context point for that.
var unprocessedMessages = data.pointValuesSince(tracker.time);
var lastTs = 0;
for(var k = 0; k < unprocessedMessages.length; k+=1) {
    var sensorData = unprocessedMessages[k].stringValue.split(" ");
    for(var j = 0; j < sensorData.length; j+=1) {
        var pointData = sensorData[j].split("z");
        if(typeof this[pointData[0]] === 'undefined' ) {
            createPoint(pointData[0]);
        }
        if( pointData[0] in sensorTransforms )
            this[pointData[0]].set( sensorTransforms[pointData[0]](Number(pointData[1])) );
        else
            this[pointData[0]].set( Number(pointData[1]) );
    }
    lastTs = unprocessedMessages[k].time;
}

//Update the timestamp we have processed unto
if(lastTs !== 0)
    tracker.set(!tracker.value, lastTs);
And here's the JSON for the script and the base points. You will probably need to define a cron pattern for the scripting data source, since the feature to have a scripting data source driven only by context point updates was added in 3.5 (which should see some sort of alpha release today or very soon). You would probably also need to create a "Numeric_All_Data" data point template. It also includes the JSON for the HTTP receiver I tested with.
{ "dataSources":[ { "xid":"DS_306b3fdc-b914-4187-9a4c-0024be71d52e", "name":"DataQueueProcessor", "enabled":false, "type":"SCRIPTING", "alarmLevels":{ "SCRIPT_ERROR":"URGENT", "DATA_TYPE_ERROR":"URGENT", "POLL_ABORTED":"URGENT", "LOG_ERROR":"URGENT" }, "purgeType":"YEARS", "updateEvent":"CONTEXT_UPDATE", "context":[ { "varName":"data", "dataPointXid":"DP_97ba3ae9-f9ca-4b36-a557-b07fc89824f1", "updateContext":true } ], "logLevel":"NONE", "cronPattern":"", "executionDelaySeconds":0, "historicalSetting":false, "script":"var sensorTransforms = this.sensorTransforms;\nif(typeof this.sensorTransforms === 'undefined') {\n this.sensorTransforms = sensorTransforms = {\n \/\/Use this map of sensor identifiers to transformation functions to\n \/\/ manipulate the value of the data\n s1: function(val) { return val*2 + 3; },\n s2: function(val) { return val*6 - 12; },\n s3: function(val) { return Math.pow(val, 2); }\n };\n}\n\nfunction createPoint(identifier) {\n var newDp = JSON.parse(JsonEmport.dataPointQuery(\n 'eq(xid,DP_BaseQueueProcessorNumeric)')).dataPoints[0];\n newDp.enabled = true;\n newDp.pointLocator.varName = identifier;\n newDp.name = \"Sensor \" + identifier;\n delete newDp.xid; \/\/To generate a new XID,\n \/\/or encode your own XID for reference elsewhere\n JsonEmport.doImport( JSON.stringify({\"dataPoints\": [newDp]}) );\n}\n\n\/\/Because there is a slight asynchrony, we need to track where we got to in the data\n\/\/ queue, so we'll use a context point for that.\nvar unprocessedMessages = data.pointValuesSince(tracker.time);\nvar lastTs = 0;\nfor(var k = 0; k < unprocessedMessages.length; k+=1) {\n var sensorData = unprocessedMessages[k].stringValue.split(\" \");\n for(var j = 0; j < sensorData.length; j+=1) {\n var pointData = sensorData[j].split(\"z\");\n if(typeof this[pointData[0]] === 'undefined' ) {\n createPoint(pointData[0]);\n }\n if( pointData[0] in sensorTransforms )\n this[pointData[0]].set( sensorTransforms[pointData[0]](Number(pointData[1])) );\n else\n this[pointData[0]].set( Number(pointData[1]) );\n }\n lastTs = unprocessedMessages[k].time;\n}\n\n\/\/Update the timestamp we have processed unto\nif(lastTs !== 0)\n tracker.set(!tracker.value, lastTs);\n", "scriptPermissions":{ "customPermissions":"", "dataPointReadPermissions":"superadmin,superadmin", "dataPointSetPermissions":"superadmin,superadmin", "dataSourcePermissions":"superadmin,superadmin" }, "editPermission":"", "purgeOverride":false, "purgePeriod":1 }, { "xid":"DS_bcd20bf3-a723-4631-a5ce-77f409e54a46", "name":"DataQueueReceiver", "enabled":true, "type":"HTTP_RECEIVER", "alarmLevels":{ "SET_POINT_FAILURE":"URGENT" }, "purgeType":"YEARS", "setType":"PUBLISHER", "dateFormat":"DATE_FORMAT_BASIC", "deviceIdWhiteList":[ "*" ], "ipWhiteList":[ "*.*.*.*" ], "setPointUrl":"", "editPermission":"", "purgeOverride":false, "purgePeriod":1 } ], "dataPoints":[ { "xid":"DP_97ba3ae9-f9ca-4b36-a557-b07fc89824f1", "name":"Data Message", "enabled":true, "loggingType":"ALL", "intervalLoggingPeriodType":"MINUTES", "intervalLoggingType":"INSTANT", "purgeType":"YEARS", "pointLocator":{ "dataType":"ALPHANUMERIC", "binary0Value":"", "includeTimestamp":true, "parameterName":"data", "setPointName":"", "settable":false }, "eventDetectors":[ ], "plotType":"STEP", "rollup":"NONE", "unit":"", "simplifyType":"NONE", "chartColour":"", "chartRenderer":{ "type":"TABLE", "limit":10 }, "dataSourceXid":"DS_bcd20bf3-a723-4631-a5ce-77f409e54a46", "defaultCacheSize":1, "deviceName":"DataQueueReceiver", "discardExtremeValues":false, 
"discardHighLimit":1.7976931348623157E308, "discardLowLimit":-1.7976931348623157E308, "intervalLoggingPeriod":15, "intervalLoggingSampleWindowSize":0, "overrideIntervalLoggingSamples":false, "preventSetExtremeValues":false, "purgeOverride":false, "purgePeriod":1, "readPermission":"", "setExtremeHighLimit":1.7976931348623157E308, "setExtremeLowLimit":-1.7976931348623157E308, "setPermission":"", "tags":{ }, "textRenderer":{ "type":"PLAIN", "useUnitAsSuffix":true, "unit":"", "renderedUnit":"", "suffix":"" }, "tolerance":0.0 }, { "xid":"DP_60527aba-6569-451a-91a7-9b1c1c940811", "name":"Queue Timestamp Tracker", "enabled":true, "loggingType":"ON_CHANGE", "intervalLoggingPeriodType":"MINUTES", "intervalLoggingType":"INSTANT", "purgeType":"YEARS", "pointLocator":{ "dataType":"BINARY", "contextUpdate":false, "settable":true, "varName":"tracker" }, "eventDetectors":[ ], "plotType":"STEP", "rollup":"NONE", "unit":"", "templateXid":"Binary_Default", "simplifyType":"NONE", "chartColour":"", "chartRenderer":{ "type":"TABLE", "limit":10 }, "dataSourceXid":"DS_306b3fdc-b914-4187-9a4c-0024be71d52e", "defaultCacheSize":1, "deviceName":"DataQueueProcessor", "discardExtremeValues":false, "discardHighLimit":1.7976931348623157E308, "discardLowLimit":-1.7976931348623157E308, "intervalLoggingPeriod":15, "intervalLoggingSampleWindowSize":0, "overrideIntervalLoggingSamples":false, "preventSetExtremeValues":false, "purgeOverride":false, "purgePeriod":1, "readPermission":"", "setExtremeHighLimit":1.7976931348623157E308, "setExtremeLowLimit":-1.7976931348623157E308, "setPermission":"", "tags":{ }, "textRenderer":{ "type":"BINARY", "oneColour":"black", "oneLabel":"one", "zeroColour":"blue", "zeroLabel":"zero" }, "tolerance":0.0 }, { "xid":"DP_BaseQueueProcessorNumeric", "name":"Base Numeric Sensor Point", "enabled":false, "loggingType":"ALL", "intervalLoggingPeriodType":"MINUTES", "intervalLoggingType":"AVERAGE", "purgeType":"YEARS", "pointLocator":{ "dataType":"NUMERIC", "contextUpdate":false, "settable":true, "varName":"notEnabledBaseDataPoint" }, "eventDetectors":[ ], "plotType":"SPLINE", "rollup":"NONE", "unit":"", "templateXid":"Numeric_All_Data", "simplifyType":"NONE", "chartColour":"", "chartRenderer":{ "type":"IMAGE", "timePeriodType":"DAYS", "numberOfPeriods":1 }, "dataSourceXid":"DS_306b3fdc-b914-4187-9a4c-0024be71d52e", "defaultCacheSize":1, "deviceName":"DataQueueProcessor", "discardExtremeValues":false, "discardHighLimit":1.7976931348623157E308, "discardLowLimit":-1.7976931348623157E308, "intervalLoggingPeriod":1, "intervalLoggingSampleWindowSize":0, "overrideIntervalLoggingSamples":false, "preventSetExtremeValues":false, "purgeOverride":false, "purgePeriod":1, "readPermission":"", "setExtremeHighLimit":1.7976931348623157E308, "setExtremeLowLimit":-1.7976931348623157E308, "setPermission":"", "tags":{ }, "textRenderer":{ "type":"ANALOG", "useUnitAsSuffix":true, "unit":"", "renderedUnit":"", "format":"0.00" }, "tolerance":0.0 } ] }
Choosing the space and the letter z as my value delimiters was arbitrary (but it saved me some HTTP encoding in my nc testing by hand). You can use anything properly query-string-encoded as your delimiters except the @, which is used when timestamps are transmitted; you would just need to adjust the script accordingly.
Also, I should mention there's an outstanding issue where HTTP receivers don't get messages coming in over IPv6. This means you should prefer 127.0.0.1 to localhost, as localhost may resolve to ::1 and get bounced by the IP whitelist: https://github.com/infiniteautomation/ma-core-public/issues/107
Sounding like a reasonable direction?