How to change the http sending value parameter
-
Hi Jose,
If you need to do something like that, you'll want to use the scripting data source as I described in the paragraph starting with "Alternatively (and perhaps more straightforwardly)", since you could simply call 'dataOut.set( )' twice.
Alternatively, you could have a second point link which only returns a value in that special case where you need to set two values.
-
@phildunlap said in How to change the http sending value parameter:
Hi jmatos,
I'm not sure I completely understand, but it sounds like Point Links may work well for you. It sounds like you want to get values from lots of points and publish them all from the same point. Assuming this is for the publisher, a solution could be something like....
- Create a virtual point named 'Alarm Out' with an XID of 'almnr' of type alphanumeric
- Create a global script, having the body:
function getPrefix() { /*So that you may change the prefix for all scripts later if required */ return "4"; }
- Create a point link '1002 Alarm' from source 'CDI1 – LP_SENSOR_MANUAL_1002/1_1/4194279' to target 'Alarm Out', with the script body:
return getPrefix() + "1002" + source.value;
- Create similar point links to 'Alarm Out' from all other points you will publish this way.
Note that the 1002 is hardcoded into the script, since there will be 1 point link from each data point you're publishing here to our 'Alarm Out' point.
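As a sketch of what each point link script produces (the actual script is JavaScript inside Mango; it is mirrored here in Python, and the source value "A" is a hypothetical example): with prefix "4" and code "1002", a source value of "A" would publish "41002A".

```python
# Python mirror of the point link concatenation above (the real script is
# JavaScript inside Mango); the source value "A" is a hypothetical example.
def get_prefix():
    return "4"  # same role as the global getPrefix()

def link_script(source_value, code="1002"):
    return get_prefix() + code + str(source_value)

print(link_script("A"))  # -> 41002A
```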
If you have only a handful to set up, you can do it by hand. If you have hundreds, I can provide a script to generate the point links for you. A helpful regex for parsing out those DeviceID values could be:
^[^/]+_(\d+)/
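As a sketch, that regex could be applied in Python like this (the point name is just an example drawn from this thread):

```python
import re

# Capture the digits between the last underscore and the first slash,
# e.g. the "1002" in "LP_SENSOR_MANUAL_1002/1_1/4194279".
pattern = re.compile(r"^[^/]+_(\d+)/")

match = pattern.match("LP_SENSOR_MANUAL_1002/1_1/4194279")
print(match.group(1))  # -> 1002
```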
Hi phildunlap,
Now that we have bought the Mango solution from IAS, we are going to step into the full configuration.
As you said in the last paragraph above, you can provide us with a script to generate all the point links.
How could this be done?
Thank you in advance,
Jose -
Hi Jose,
I'm sure I had a clearer picture in my mind of what I would need to write you when I said that. But here's a python script to generate JSON you can import on the import/export page. You'll need to edit the paths in the script if you want to use it, and it will take in a CSV file like:
DP_123456,1003
DP_654321,1004
and output JSON for point links you can import. You may also need to modify the target point XID or other things like that. Here's the python:
import json
from StringIO import StringIO

#use double slashes for windows, i.e. "C:\\path\\to\\new-point-links.csv"
csvPointXids = open("/path/to/new-point-links.csv")

#I will treat this as though it's a CSV with two columns:
#column 0 is the source point XID, column 1 is the code for the point link body, like 1002
basePointLink = """{
    "xid":"PL_17-3-1_gen_%(loopCounter)d",
    "sourcePointId":"%(sourceXid)s",
    "targetPointId":"almnr",
    "event":"CHANGE",
    "logLevel":"NONE",
    "disabled":false,
    "script":"return getPrefix() + '%(outputNumber)s' + source.value",
    "scriptPermissions":{
        "customPermissions":"",
        "dataPointReadPermissions":"superadmin",
        "dataPointSetPermissions":"superadmin",
        "dataSourcePermissions":"superadmin"
    },
    "writeAnnotation":false
}"""

#uncomment this if you have a header line to consume
#csvPointXids.readline()

outputConfig = {"pointLinks":[]}
loopCounter = 1
for line in csvPointXids.readlines() :
    data = line.replace("\r","").replace("\n","").split(",")
    if len(data) < 2 :
        continue
    pointJson = basePointLink % {"loopCounter": loopCounter, "sourceXid": data[0], "outputNumber": data[1]}
    #print pointJson
    outputConfig["pointLinks"].append( json.load( StringIO( pointJson ) ) )
    loopCounter += 1
csvPointXids.close()

#You need to edit this output path as well, e.g. "C:\\path\\to\\output.json"
output = open("/path/to/output.json", "w+")
output.write( json.dumps( outputConfig, indent=4, sort_keys=False, separators=(",",":") ) )
output.close()
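To see what the template substitution produces for one CSV row, here's a minimal sketch (Python 3, with a trimmed-down template; the XID "DP_123456" is a hypothetical example):

```python
import json

# Trimmed-down version of the basePointLink substitution above;
# the source XID "DP_123456" is a hypothetical example.
template = '{"xid":"PL_17-3-1_gen_%(loopCounter)d","sourcePointId":"%(sourceXid)s","targetPointId":"almnr"}'
pointJson = template % {"loopCounter": 1, "sourceXid": "DP_123456"}
link = json.loads(pointJson)
print(link["xid"])  # -> PL_17-3-1_gen_1
```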
-
Hi phildunlap,
:-( Sorry I can't figure out two things:
A) Where do I run this code?
- I have run it in my windows python (3.6) but the only output is:
{ "pointLinks",[] }
B) After I run it, do I have to import the JSON file into Mango...?!
This is mainly because I can't understand how the code in basePointLink gets into the JSON... the resulting pointLinks is empty. Thank you,
Jose -
Hi Jose,
I would bet that means there are no lines in your CSV, or there is no comma in the line separating the two fields. You can add a statement like
print line
inside the
for line in csvPointXids.readlines() :
block, before the if len(data) check, to see each line print at the command line. If there are no lines in your file you will see no output. I think if you have the file path wrong in the initial open() it will error.
It looks like I got the separators backwards (it should be separators=(",",":")), so I edited the script above to fix that. I did test that the script generates JSON (but I didn't import the JSON). -
It's working!
Thank you very much. :-)
-
Hi phildunlap,
How do I stop the "Failed to send email (.)" messages? Is there any way to disable the email service in the settings? I don't need it.
thank you
-
Hi jmatos,
The event level is set in the "System event alarm levels" section of the system settings, under "Email send failure".
You can change this to "Do not log" (event handlers of the "Email send failure" event will be notified, but nothing will be stored in the database) or "Ignore" (nothing will happen).
I would wonder why you're sending emails if they're failing and you do not want them. Did you create an email event handler, or have you set your user to receive event emails above a certain threshold?
-
Hello,
I have a DataSource with several hundred DataPoints that I have imported and want to discard. What is the best and quickest way to do it?
Thanks in advance,
Jose -
Hi Jose,
You can delete the data source if it's all the points on the data source.
You can use the JSON from your import and this script to generate SQL delete statements:
#!/bin/python
import json

importedPoints = open("/path/to/import.json")
imprt = json.load(importedPoints)
importedPoints.close()

outputFile = open("/path/to/output.sql", "w+")
for dp in imprt["dataPoints"] :
    outputFile.write("DELETE FROM dataPoints WHERE xid='"+dp["xid"]+"';\n")
outputFile.close()
The next step would be to run the SQL statements in an SQL console.
-
Sorry, I do not understand where I "choose" the points to be deleted.
In my mind I had (certainly wrongly) exported the data source into a CSV file, then run a function in Excel to remove the DataPoints that I do not want. I was hoping that Mango could import this kind of file back in reverse.
-
There is no way to delete points through the import/export functions. You can generate the SQL from a CSV really easily, though; just get the XID from a column in the CSV...
#!/bin/python
xidColumn = 3 #change this!

points = open("/path/to/points.csv")
outputFile = open("/path/to/output.sql", "w+")
for line in points.readlines() :
    data = line.replace("\r","").replace("\n","").split(",")
    outputFile.write("DELETE FROM dataPoints WHERE xid='"+data[xidColumn]+"';\n")
points.close()
outputFile.close()
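For clarity, the per-row transformation the script performs can be sketched on its own (the XID below is a hypothetical example):

```python
# Sketch of one CSV row becoming one DELETE statement; the XID is hypothetical.
def row_to_delete(line, xid_column=3):
    data = line.replace("\r", "").replace("\n", "").split(",")
    return "DELETE FROM dataPoints WHERE xid='" + data[xid_column] + "';"

print(row_to_delete("a,b,c,DP_123456\n"))  # -> DELETE FROM dataPoints WHERE xid='DP_123456';
```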
-
The project is now installed and working at the client. We are in real-world testing and paying attention to how it behaves.
To try to work around the issue in C) ("In the virtual data source there is an "update period" that makes the system send the last alarm every period. Is there some way to deactivate it?"), I set the update period to 300000 h (about 34 years). Do you see any inconvenience with that?
Thank you very much
-
Yeah that's fine. What'll happen is every time the data source is enabled or disabled it will poll once, and then it'll be negligible load on the system.
In Mango 3 I added a feature to the virtual data source to disable its polling entirely.
-
Hi Phildunlap,
As you know the solution is already installed at client facilities.
Unfortunately I have come across a "huge" problem that I can't figure out. :-(
When there are more than two (2) simultaneous alarms - say six - I can't get them all out through the HTTP publisher. The alarms are registered in the history, but only two reach the destination. It seems they are lost between the publisher and the network. I've used Wireshark and I don't see them going out.
I've tried increasing the cache size (is it necessary to restart the whole Mango instance, or is a simple save enough?).
May you please help me on this one?
Thanks in advance,
Jose -
@jmatos said in How to change the http sending value parameter:
When there are more than two simultaneous alarms - say six - I can't get them all out through the HTTP publisher. The alarms are registered in the history, but only two reach the destination.
Do you have any answer for me regarding this issue?
-
Hi jmatos,
I would wonder,
- What is the update event for the publisher?
- What is the logging type for the Alarm out point?
- Are you using the NoSQL database?
My expectation is that the logging type for the alarm point is discarding some seemingly-backdated data in the concurrent update situation, or possibly that the data is occurring at the same millisecond, and the NoSQL database can't store more than one value at one timestamp.
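The same-millisecond concern can be illustrated with a plain dict keyed by timestamp, standing in for a store that keeps one value per timestamp (this is purely an illustration, not Mango's actual storage code; the alarm strings are hypothetical):

```python
# Hypothetical stand-in for a time-series store that keeps one value
# per timestamp (not Mango's actual storage code).
store = {}

def write(timestamp_ms, value):
    store[timestamp_ms] = value  # a later write at the same ms overwrites

# Three "simultaneous" alarms landing on the same poll timestamp:
for alarm in ["41002A", "41003A", "41004A"]:
    write(1500000000000, alarm)

print(len(store))  # -> 1 (only the last value at that millisecond survives)
```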
-
Hi phildunlap
- All
- When point value changes
- I think so (How can I be sure?)
More info: when I go to the [Data Point Details] menu, all the alarm outputs are registered OK in the history.
Here in the lab I tried simulating an alarm sent through the HTTP publisher to the same destination software, with a virtual source generating a random number every 100 ms, and all of them reached the destination.
Tomorrow I'll go to the client and try this virtual send.
Updating after visiting the client:
Test results with real BACnet objects at the client:
Twice I had to restart Mango, and five of five(!) alarms got to the destination! But only on startup... :-( All the tests afterwards only got the same old two. So, the alarms are registered at the entry level and reported in the Data Point Details, but only two reach the destination.
I think you may be right concerning the same millisecond, but in that case something has to explain the five-of-five that reached the destination on startup.
Is there some delay we can insert to work around the possible same timestamp?
Important: as I told you before, I'm not sure what DB is in use. How can I be sure, and if it's not NoSQL, how do I change it? The problem could be here... :-(
Thanks in advance,
Jose -
Hmm.
Before working on a workaround for the same milliseconds, let's try to confirm that's actually occurring (but maybe this would be the workaround). One thing you could try is putting a couple manual offsets in the point links' times, like:
TIMESTAMP = source.time+1; //one millisecond later than the source's TS //you could use +2 on another one, and so on
It's possible that on startup it registered a COV notification, got those point values (everything would have different timestamps), but for some reason reverted to polling those points. In which case, all new values would have the timestamp of the poll.
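As a sketch of why per-link offsets help (using a plain dict as a hypothetical one-value-per-timestamp store; not Mango's actual code, and the alarm strings are made-up examples):

```python
# Hypothetical sketch: giving each point link a distinct millisecond offset
# makes concurrent values land on distinct timestamps.
store = {}
source_time = 1500000000000  # one shared poll timestamp (hypothetical)

# mirrors TIMESTAMP = source.time + 1, + 2, ... across three point links
for offset, alarm in enumerate(["41002A", "41003A", "41004A"], start=1):
    store[source_time + offset] = alarm

print(len(store))  # -> 3 (all three values are kept)
```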
-
Hi Phil,
Where do I put it? Sorry, but I can't figure it out. Probably because I have 1800 point links... I'm lost here. :-(
As instructed previously by you, I put in a global script
function getPrefix() { /*So that you may change the prefix for all scripts later if required */ return "4"; }
This is running on all point links. Is there any way to add the time code here? Or am I confused? (it's not the first time :D )
thank you