code: https://github.com/madewithlinux/home-monitoring-grafana
- tcpdump, to capture network traffic from the app during login and while viewing historic data for the device
- curl, to experiment with the HTTP endpoints that you find
My phone already had tcpdump installed (thanks LineageOS!), but if yours doesn't have that, you can probably build and install it if it's rooted.
# capture full packets on the phone (needs root), then pull the capture off it
adb shell su -c 'tcpdump -s 0 -v -w /sdcard/out.pcap'
adb pull /sdcard/out.pcap out.pcap
(If the target Android device is not rooted, you may be able to use mitmproxy on a separate device instead.)
Wireshark does packet reassembly automatically, so really, you just need to put http in the filter box and start looking at requests and responses.
When you find something that you want to extract (like some response data), right click -> copy -> bytes -> printable text only.
Protip: pipe the output to jq to pretty-print it.
curl -v --user-agent 'okhttp/2.7.5' \
  'www.i-elitech.com//apiLoginAction.do?method=login&password=TEMPTOP_PASSWORD&username=veryrealemail@email_site.net' | jq
returns something like:
{
  "success": true,
  "result": "login",
  "JSESSIONID": "JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN",
  "token": "JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN",
  "user": {
    "email": "veryrealemail@email_site.net",
    ...
    "token": "JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN",
    "username": "veryrealemail@email_site.net"
  },
  "androidVersion": 54,
  "iosVersion": 10,
  "informationId": 22
}
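The same login call in Python with requests (the library my script uses, more on that below) would look roughly like this. This is just a sketch; the endpoint, parameters, and user-agent string are copied straight from the curl command above, and you'd substitute your own credentials:

import requests

BASE_URL = "http://www.i-elitech.com"

# log in with the same endpoint and parameters as the curl command above
resp = requests.get(
    BASE_URL + "//apiLoginAction.do",
    params={
        "method": "login",
        "username": "veryrealemail@email_site.net",
        "password": "TEMPTOP_PASSWORD",
    },
    headers={"User-Agent": "okhttp/2.7.5"},
)
resp.raise_for_status()
token = resp.json()["token"]  # same value as the JSESSIONID field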
The Android app sends the token as both a cookie and a header. I'm pretty sure this is redundant, but I replicated it anyway.
curl -v --cookie 'JSESSIONID=JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN' -H 'JSESSIONID: JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN' --user-agent 'okhttp/2.7.5' \
  'http://www.i-elitech.com//apiDeviceAction.do?method=getList&typeList=0&userId=7582' | jq
The important thing in this response is the device ID; we use it to query the device data endpoint. (It also lists the names of each sensor, which I could have used. But I only have one device that uses this API, and its sensors are very much not going to change, so I just hardcoded them in the script that I eventually wrote.)
{
  "success": true,
  "total": 1,
  "rows": [
    {
      "creatorName": "veryrealemail@email_site.net",
      "deviceDepartmentId": 6,
      "firstDataTime": {
        ...
      },
      "guid": "16972484243637555312",
      "id": 38279,
      "lastDataTime": {
        ...
      },
      "name": "M10i",
      "online": true,
      "probes": [
        {
          "currentValue": "0.3",
          "deviceGuid": "16972484243637555312",
          "deviceId": 38279,
          "name": "HCHO",
          "number": 1,
          ...
        },
        {
          "currentValue": "1.0",
          "deviceGuid": "16972484243637555312",
          "deviceId": 38279,
          "name": "PM2.5",
          "number": 2,
          ...
        },
        {
          "currentValue": "1.0",
          "deviceGuid": "16972484243637555312",
          "deviceId": 38279,
          "name": "TVOC",
          "number": 3,
          ...
        },
        {
          "currentValue": "4.0",
          "deviceGuid": "16972484243637555312",
          "deviceId": 38279,
          "name": "AQI",
          "number": 4,
          ...
        }
      ],
      ...
      "wifi": true,
      "wifiType": 0
    }
  ],
  "authorityMap": {
    "ENERGY": 2,
    "REAL_TIME_DATA": 0,
    ...
  }
}
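Pulling what we need out of that response is easy enough; a sketch (the function name here is just for illustration):

def parse_device_list(getlist_response: dict):
    """Pull the device ID and probe names out of a getList response."""
    device = getlist_response["rows"][0]  # I only have the one device
    probe_names = {p["number"]: p["name"] for p in device["probes"]}
    # probe_names -> {1: "HCHO", 2: "PM2.5", 3: "TVOC", 4: "AQI"}
    return device["id"], probe_names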
Getting the historic data is basically the same as the process above for getting the device list, except that here we can get all the data between two dates.
I used this API to query for the last N minutes of data, instead of just periodically checking the most recent reading, for two reasons:
(Side note: the API has what looks like pagination, in case you request too much data, but that doesn't seem to work. It just returns all the data between the two dates, even if that's more than the row limit you requested.)
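So the query my script builds is just "now minus N minutes" to "now". Here's a sketch of the shape of that call; note that the endpoint and parameter names below are hypothetical placeholders, not the real ones (the real call is in the repo linked at the top), while the device ID and token come from the earlier responses:

from datetime import datetime, timedelta
import requests

token = "JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN"  # from the login response
end = datetime.now()
begin = end - timedelta(minutes=10)  # "last N minutes", with N=10 here

resp = requests.get(
    # NOTE: hypothetical endpoint and parameter names, for illustration only
    "http://www.i-elitech.com//apiDeviceDataAction.do",
    params={
        "method": "getList",
        "deviceId": 38279,
        "startTime": begin.strftime("%Y-%m-%d %H:%M:%S"),
        "endTime": end.strftime("%Y-%m-%d %H:%M:%S"),
    },
    headers={"User-Agent": "okhttp/2.7.5", "JSESSIONID": token},
    cookies={"JSESSIONID": token},
)
rows = resp.json()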
Use requests, and imitate the user-agent string that the Android app uses, as in the sketch below. (I didn't test without this, but it's probably good to keep it consistent, just to be safe.)
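Setting up a requests session that imitates the app, and sends the token as both a cookie and a header like we saw earlier, might look something like this:

import requests

session = requests.Session()
session.headers["User-Agent"] = "okhttp/2.7.5"  # imitate the Android app

# after logging in, send the token both ways, like the app does
token = "JZ3OB4W9MD8TAW5W0Q6DHBTHR9IH2CGN"
session.headers["JSESSIONID"] = token
session.cookies.set("JSESSIONID", token)

# same device-list call as the curl command earlier
devices = session.get(
    "http://www.i-elitech.com//apiDeviceAction.do",
    params={"method": "getList", "typeList": 0, "userId": 7582},
).json()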
I saw a note on Stack Overflow or somewhere that said when you're scraping a website, it's a good idea to put an email address or some contact info in an HTTP header. I decided not to do this, though, since I already made an account with them; if they need to send me a nastygram about how I'm using their service, they already have everything they need to do that.
Basically, just follow the guide at http://nilhcem.com/iot/home-monitoring-with-mqtt-influxdb-grafana
I had to change a few things in the docker-compose file to get it to work; those changes are included in my fork of the repo.