Graphite - sending metrics from injection moulding machines

I love Grafana and metrics. I want to get metrics from big mechanical devices, for example plastic injection moulding machines. They have PLC devices. Anyone have any idea how I would start this off?
Thanks
Trev

If there is a way to get the metrics out of the PLC devices, then the simplest way to get started will be to write a small script that gathers the information you're interested in and feeds it into a Time Series Database like Graphite or InfluxDB.
This can be as simple as a script run via cron, in whatever language you're most comfortable with.
Once your measurements are being fed into the TSDB, you can use Grafana to graph them over time.
Here's a really simple example I wrote to pull stats from a HomeGenie home automation system and push them into Graphite: https://gist.github.com/DanCech/348d37ee45898b34abd3
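In case it helps, here's a minimal sketch of what such a cron script could look like, using Graphite's plaintext protocol (one "path value timestamp" line per metric, sent to carbon's default port 2003). The read_machine_metric() function is a hypothetical placeholder for however you actually read from the PLC (Modbus, OPC UA, a vendor library, etc.):

```python
#!/usr/bin/env python3
# Minimal sketch: push one reading into Graphite's plaintext listener.
# Assumptions: carbon-cache reachable on localhost:2003, and
# read_machine_metric() is a hypothetical stand-in for your PLC read.
import socket
import time

CARBON_HOST = "localhost"   # assumed Graphite/carbon host
CARBON_PORT = 2003          # carbon's default plaintext port

def read_machine_metric():
    # Hypothetical placeholder: replace with your real PLC read.
    return 42.0

def send_to_graphite(path, value, timestamp=None):
    timestamp = int(timestamp or time.time())
    line = f"{path} {value} {timestamp}\n"
    with socket.create_connection((CARBON_HOST, CARBON_PORT), timeout=5) as sock:
        sock.sendall(line.encode("ascii"))

if __name__ == "__main__":
    # Run from cron every minute, one line per metric.
    send_to_graphite("factory.moulding.machine1.cycle_time", read_machine_metric())
```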

Related

How to achieve fast video transmission from an ESP32-CAM to a PC?

To make an autonomous drone and help it fly from one place to another without any human intervention, I plan to send data using two ESP32s and receive that data on my PC. I won't use the onboard navigation system inside the drone. I need suggestions regarding the signal-receiving unit (I will send the data over WiFi), which will feed information directly to my PC. Kindly give me information on the best way to receive signals from the ESP32 on a PC, in the shortest amount of time and on a reasonable budget.
We were thinking about the NodeMCU, but we aren't going to use it. Moreover, I am in the initial phase and have just started researching this. I am also pretty new to this, so any suggestions will help me explore and get what I want.
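One common pattern here is to have the ESP32 send UDP datagrams over WiFi and run a small listener on the PC, since UDP avoids TCP's retransmission latency for time-sensitive streams. A minimal sketch of the PC side; the port and payload format are assumptions and need to match whatever the ESP32 firmware sends:

```python
# Minimal sketch of the PC-side receiver, assuming the ESP32 sends UDP
# datagrams to port 5005 (port and payload format are assumptions).
import socket

LISTEN_ADDR = ("0.0.0.0", 5005)  # assumed port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN_ADDR)
print(f"Listening for ESP32 packets on {LISTEN_ADDR}...")

while True:
    data, addr = sock.recvfrom(65535)  # max UDP payload size
    print(f"{len(data)} bytes from {addr}")
```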

Most efficient way to transfer images continuously across the network

I have an architecture in which separate services run independently. One of my services continuously fetches frames from a camera and sends them to another service, which performs some processing on the frame (like face detection, face recognition, etc.) and sends back the results. The services can run on different machines.
Please suggest a good library that is fast at transferring frames between services. I already have a few options in mind, like Kafka and ZeroMQ, but I am confused about which to choose.
Any good pipeline design is also welcome. Thanks
After experimenting with a Kafka pipeline in a very similar scenario, my suggestion is that Kafka is not a suitable choice here, since message frequency and size are the main concern when dealing with live image streams.
The choices that can be tested are:
ZeroMQ
RabbitMQ
ZeroMQ will very likely perform well in the given scenario.
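To illustrate, here's a minimal sketch of a ZeroMQ PUSH/PULL pipeline using pyzmq and OpenCV; the hostname and port are assumptions, and the JPEG quality setting is the usual size/fidelity trade-off:

```python
# Minimal sketch: camera service pushes JPEG-encoded frames over ZeroMQ.
# Assumptions: pyzmq and OpenCV installed, port 5555 chosen arbitrarily.
import cv2
import zmq

ctx = zmq.Context()
push = ctx.socket(zmq.PUSH)
push.bind("tcp://*:5555")

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    # JPEG-encode to keep messages small before sending over the wire.
    ok, buf = cv2.imencode(".jpg", frame, [cv2.IMWRITE_JPEG_QUALITY, 80])
    if ok:
        push.send(buf.tobytes())
```

And the receiving (processing) side:

```python
# Minimal sketch: processing service pulls frames and decodes them.
import cv2
import numpy as np
import zmq

ctx = zmq.Context()
pull = ctx.socket(zmq.PULL)
pull.connect("tcp://camera-host:5555")  # assumed hostname

while True:
    buf = pull.recv()
    frame = cv2.imdecode(np.frombuffer(buf, dtype=np.uint8), cv2.IMREAD_COLOR)
    # ... run face detection / recognition on `frame` here ...
```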

Solarwinds SWQL to Query Data for Routers and Switches

Does anybody know where to find sample SolarWinds SWQL queries to get health data for routers and switches? If anybody can post any samples, it would be a great help.
The easiest way to retrieve router/switch health data is probably by configuring OID/MIB imports using the Universal Device Poller (UnDP) tool available on your primary poller.
Simple Google searches will turn up recent unified MIB listings, which you can collect from the target devices and display in node-related pages of your own design.
It's admittedly tedious, but once they're in they're good until you replace the hardware, as long as you're willing to commit to SNMP polling of the device. Furthermore, the tool allows you to perform trial-and-error testing before committing the MIB import you're working on. For reference, I'm referring to NPM 11.5/12; earlier versions should have this tool up to a point, but no promises.
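As a concrete starting point for the query side, here's a minimal sketch using the orionsdk Python client with a basic health query against Orion.Nodes; the server name and credentials are placeholders, and the exact columns available can vary by NPM version, so treat it as a template rather than a verified sample:

```python
# Minimal sketch: pull basic node health over SWQL with the orionsdk
# Python client. Server/credentials are placeholders; available columns
# can vary by NPM version, so verify against your install.
import requests
from orionsdk import SwisClient

requests.packages.urllib3.disable_warnings()  # many pollers use self-signed certs

swis = SwisClient("your-orion-server", "admin", "password")  # placeholders

results = swis.query("""
    SELECT TOP 10 NodeID, Caption, Status, CPULoad, PercentMemoryUsed
    FROM Orion.Nodes
    WHERE Vendor = 'Cisco'
""")

for row in results["results"]:
    print(row["Caption"], row["Status"], row["CPULoad"], row["PercentMemoryUsed"])
```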

openHAB for two or more homes

I started exploring openHAB for my home automation. It looks to be a great application for home automation. I want to automate two homes and run openHAB on one centrally placed server. Is it possible to segregate the data for my two homes and provide use-based access for the two homes?
Or will I have to have two instances running on my server?
Please advise if anyone has done this before.
You can (I believe) provide different sitemaps, but the most important question is: how will a central openHAB instance communicate with the "other" home?
Especially if you're going to use bindings which require a piece of hardware, like Z-Wave etc.
You can potentially play with MQTT and have a small Raspberry Pi running in the "other" home feeding the MQTT broker.
Assuming there is no hardware- or range-based issue with using openHAB for two homes (e.g. a Z-Wave USB dongle when the second home is out of range) and there is network connectivity between the two houses, there are a number of ways you can accomplish this. Here is one.
The easiest would probably be to use a naming convention for your items and groups so you can easily tell which house an item comes from. You would probably want to set up a separate sitemap for each house as well. If I understand your question, this should segregate the data for you based on name and provide use-based access for each home.
If you want to segregate the data even more thoroughly, you can configure your persistence to save all the items from one house to one DB and all the others to a different one, though you will need two different persistence bindings set up (i.e. one uses rrd4j and another uses db4o). I'm not sure this provides any advantage.
The final step is getting the data from the remote house into openHAB. How this is accomplished will depend on the nature of the sensors and triggers in the other house. You can use the HTTP binding, the TCP/IP binding, or an MQTT broker. I've personally exposed a couple of my Raspberry Pi based sensors to openHAB using a Python script and the paho library, publishing the sensor data read from the GPIO pins to an MQTT broker, and it works great.
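To illustrate that last approach, here's a minimal sketch of a Pi-side publisher using paho-mqtt and RPi.GPIO; the broker address, topic convention, and pin number are assumptions to adapt to your setup:

```python
# Minimal sketch: read a GPIO sensor on a Raspberry Pi and publish it to
# an MQTT broker that openHAB's MQTT binding subscribes to. Broker
# address, topic, and pin number are assumptions.
import time
import RPi.GPIO as GPIO
import paho.mqtt.publish as publish

BROKER = "192.168.1.10"             # assumed broker (could be the central server)
TOPIC = "house2/livingroom/motion"  # assumed topic naming convention
PIN = 17                            # assumed GPIO pin

GPIO.setmode(GPIO.BCM)
GPIO.setup(PIN, GPIO.IN)

while True:
    reading = GPIO.input(PIN)
    publish.single(TOPIC, payload=str(reading), hostname=BROKER)
    time.sleep(10)
```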
Centralization vs. segregation: you have to decide which one has more advantages and less risk.
Both houses will store data on the server (openHAB 2, MQTT, DB/rrd4j) and each will have access to it; that must be clarified.
Network connectivity is an obvious requirement: it should be stable between the two sites. Security is another issue, and not only digital security but safety of life (what happens to HVAC control or safety appliances during a network outage?).
Configuration is well supported either way: separate config files (items, rules, persistence, etc.) and connectivity in a hierarchy offer endless approaches and capabilities.
In the newest version of the Android app you can add multiple openHAB servers. Why not just use two instances of openHAB?

Need a free Application for network monitoring, traffic per port, and a weekly report

I would like to know if there's an open source application that can:
- Be open-source (obviously free, no cost at all)
- Check which ports are being used and the bandwidth used by each of them
- Based on the requirements above, create a weekly report with details of each port per day, with time specifications
I have read about Ethereal for the network monitoring and JasperReports for the report-creation stage, but haven't gone into much detail yet.
If my specifications cannot be met with a free app, then I could work with Java to check which ports are being used, but I still don't know if Java could handle ALL the requirements. I would really like an answer on that, because I could start working on it right now, but I want to be sure Java can cover everything.
PS: If Java can't be a solution, what would you suggest?
Suggestions for you:
Colasoft Capsa Free: http://www.colasoft.com
Spiceworks (new user, so I can't post the link).
Or google: free traffic monitor
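For the port-enumeration requirement specifically, a small script may be enough. Here's a minimal sketch using Python's psutil library (an assumption; the same idea is possible in Java). Note that per-port bandwidth cannot be read this way; that part needs packet capture, which is what Ethereal/Wireshark does:

```python
# Minimal sketch: list listening ports and their owning processes with
# psutil. May need admin/root privileges on some platforms. Per-port
# bandwidth is NOT available here; that requires packet capture.
import psutil

for conn in psutil.net_connections(kind="inet"):
    if conn.status == psutil.CONN_LISTEN:
        laddr = f"{conn.laddr.ip}:{conn.laddr.port}"
        print(f"LISTEN {laddr} pid={conn.pid}")
```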
