Warding Off Winter With TI CC2650, BeagleBone Black, and a Nest Thermostat

Winter is coming, and the northeastern United States, where I live, can get cold! And the houses are often old and drafty. So I built a wireless sensor system tied to a Nest thermostat to keep me cozy wherever I am in the house, instead of keeping my thermostat cozy, bolted to its wall.

I did this with Texas Instruments’ SensorTags, a BeagleBone Black, DeviceHive for device registration and management, and a Nest thermostat.

This one was coded up using Python to speak Bluetooth LE to the SensorTags, and JavaScript on Node.js to do the device registration and telemetry messaging.

Yeah, they call it Bluetooth Smart now, but I don’t like that name! LE for me. 😉

DeviceHive managed the device registration for the SensorTags, making it easy to add new tags and wrangle them, and the wot.io data service exchange piped the messaging to scriptr.io for some transform logic in a convenient cloud service. bip.io did visualization and Nest control, and Circonus handled data logging and analytics.
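
If you want to poke at a SensorTag over BLE from the shell before writing any Python, a quick gatttool session is a good sanity check. This is just a sketch: the MAC address is a placeholder, and the 0x21/0x24 handles are the commonly documented CC2650 IR temperature characteristics, which can vary by firmware, so list yours first.

[bash]
# List the tag's characteristics to confirm the handles (MAC is a placeholder)
gatttool -b B0:B4:48:00:00:00 --characteristics

# Enable the IR temperature sensor (config characteristic, commonly 0x24)...
gatttool -b B0:B4:48:00:00:00 --char-write-req --handle=0x24 --value=01
sleep 1
# ...then read the raw measurement (data characteristic, commonly 0x21)
gatttool -b B0:B4:48:00:00:00 --char-read --handle=0x21
[/bash]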

Read the articles here:

http://labs.wot.io/ship-iot-with-beaglebone-black-ti-sensortags-and-devicehive/

…and here:

http://labs.wot.io/ship-iot-with-beaglebone-black-ti-sensortags-and-devicehive-part-2/

And the code is on GitHub!

There’s also a video I made to go along with the rest.

Delicious Coffee with a Kinoma Create and PubNub

Update: I have a forthcoming three-part series of articles on Texas Instruments’ e2e blog that dives into some of the hardware engineering behind this, too. I’ll link them here when it’s finally published!

Updated Update, here are the links:
Part 1
Part 2

I’m a coffee snob. I admit it. I’m proud of it. Although, upon reflection, I really think it boils down to appreciating deliciousness. And who wouldn’t?


Anyhow, I was able to weave my love of coffee together with another Internet of Things demonstration project, this time coding JavaScript and XML on a Kinoma Create, and building a Pyrex-cased temperature probe with a Texas Instruments LM35 military-grade analog temperature sensor. The Kinoma read the temperature sensor and asynchronously published telemetry messages to PubNub. From there, the wot.io data service exchange subscribed to the messages and routed them to a number of data services I used for analytics and alerting, including scriptr.io, bip.io, and Circonus. As with all these prototypes, it was up and running very fast, and I could iterate rapidly as new insights formed. Powerful stuff that can really hit your bottom line and time-to-market in good ways.
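
By the way, if you want to emulate the Kinoma’s publish step from a shell, PubNub’s REST interface makes it a one-liner. Here’s a sketch using PubNub’s public demo keys, with a made-up channel name and payload; swap in your own pub/sub keys:

[bash]
# Publish a JSON telemetry message via PubNub's HTTP interface.
# URL shape: /publish/<pub_key>/<sub_key>/0/<channel>/0/<url-encoded JSON>
# "demo"/"demo" are PubNub's public demo keys; "coffee-temp" is made up.
curl "https://pubsub.pubnub.com/publish/demo/demo/0/coffee-temp/0/%7B%22tempC%22%3A91.5%7D"
[/bash]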

I discovered my drip coffee maker sucks at temperature regulation, and my pourover and French press technique has improved quite a bit. So has the coffee!

You can read the full article I wrote here. And the code is on GitHub, as well.

This setup was at World Maker Faire 2015 in New York City, at the Kinoma booth. Cool to have some of my creations featured there =)

 

How to Transfer your OTR Private Key Between Hosts

OTR (Off-the-Record Messaging) is a good protocol for encrypted chat. I use it often, but have always wondered how to transfer my OTR private keys between computers and devices. Up until now, every machine would have a separately generated key, and I’d have to re-verify keys with my friends before chatting. Always verify your keys! Today I finally looked around for where the keys are stored: on a Mac using Adium, and on a Debian Linux box using Pidgin.

The Pidgin OTR keys were easy to find. Pidgin is built on libpurple under the hood, and there was simply a .purple directory in my home dir. In there we find the OTR files:

[text highlight="6,7,8"]
accounts.xml
blist.xml
certificates/
icons/
logs/
otr.fingerprints
otr.instance_tags
otr.private_key
plugins/
prefs.xml
smileys/
status.xml
[/text]

Inside the otr.private_key file we see the keys in a parenthesis-delimited data structure (how (lispy!)), which are set per-account (this is not my actual key, of course):

[text highlight="5,6,7,8,9,10,11,12,13"]
(privkeys
  (account
    (name your_username_here)
    (protocol prpl-aim)
    (private-key
      (dsa
        (p #9AD61CB50561A45116DC9735ED1DAABA372308628ABDCA1E92B7283189B10945DAB0D20594E4DE5E92B2334635208D78D17371D6012426A347C831B89D5EA7D2CED4CAD0D5DADA46DCCC13E6CB436324E226D68DBB7165BE69BDCCE667B59AD47423C586A8700D47BB0821D1BB8086E73073DBA847AE358B0231D3A9BC112A96358#)
        (q #ECAD3933491A19B6A4170EE921D480B0AB736E244C1B0#)
        (g #9AD61CB50561A45116DC9735ED1DAABA372308628ABDCA1E92B7283189B10945DAB0D20594E4DE5E92B2334635208D78D17371D6012426A347C831B89D5EA7D2CED4CAD0D5DADA46DCCC13E6CB436324E226D68DBB7165BE69BDCCE667B59AD47423C586A8700D47BB0821D1BB8086E73073DBA847AE358B0231D3A9BC112A96358#)
        (y #9AD61CB50561A45116DC9735ED1DAABA372308628ABDCA1E92B7283189B10945DAB0D20594E4DE5E92B2334635208D78D17371D6012426A347C831B89D5EA7D2CED4CAD0D5DADA46DCCC13E6CB436324E226D68DBB7165BE69BDCCE667B59AD47423C586A8700D47BB0821D1BB8086E73073DBA847AE358B0231D3A9BC112A96358#)
        (x #ECAD3933491A19B6A4170EE921D480B0AB736E244C1B0#)
        )
      )
    )
  )
[/text]

The private-key section is the bit you want to transfer.

On the Mac, using Adium, you can find your OTR key files in the directory ~/Library/Application Support/Adium 2.0/Users/Default but the contents are slightly different. Adium names the protocols differently. This is why it’s best to transfer only the private-key section of the file, and make sure you copy from and to the intended protocols.

You can also see the otr.fingerprints file, which has the list of verified (or simply seen) fingerprints for your friends. Copy that around also, and you won’t have to re-verify keys! You could also do this on your mobile devices in theory, but that’s going to vary widely, so good luck and have fun!

Make sure you don’t mess up the file access permissions during this process – you wouldn’t want someone to steal your keys! chmod 600 is probably what you want, but security is complex, so don’t take my word for your setup.
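
Here’s the rough shape of a transfer from the Linux box to the Mac, as a sketch; the hostname is a placeholder, and since the protocol names differ between Pidgin and Adium, merge the private-key sections by hand rather than overwriting files wholesale.

[bash]
# Pull the Pidgin OTR files over from the Linux box ("linuxbox" is a placeholder)
scp linuxbox:.purple/otr.private_key linuxbox:.purple/otr.fingerprints /tmp/

# Merge the private-key sections into Adium's copy by hand in an editor;
# remember the protocol names (e.g. prpl-aim) differ between clients
open -t "$HOME/Library/Application Support/Adium 2.0/Users/Default/otr.private_key"

# Keep the permissions tight, and clean up the loose copies when done
chmod 600 "$HOME/Library/Application Support/Adium 2.0/Users/Default"/otr.*
rm -P /tmp/otr.private_key /tmp/otr.fingerprints
[/bash]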

 

Novena Laptop Linux Samsung 840 EVO Hard Drive Temperature Sensor Fix

The Novena laptop’s Linux build doesn’t know about the temperature sensor on the Samsung 840 EVO drive that it ships with. While this isn’t a problem specific to the Novena by any means, it will affect its users. It’s a simple fix: just edit the hddtemp.db file and add the proper ID string and temp code to it. You can just echo-append it, or if you are fussy, open it in an editor and put it neatly in the Samsung section!

[bash]
echo '"Samsung SSD 840 EVO 250G B" 190 C "Samsung SSD 840 EVO 250GB"' >> /etc/hddtemp.db
#                              ^--- that space needs to be there; see below
[/bash]

If you want to find out how to add this to hddtemp.db yourself, or for a different model drive that’s not in the db file, use smartctl and hddtemp --debug. The -A flag on smartctl lists the drive’s attributes. Look for the ID# of the temperature sensor. The attribute name may also indicate Celsius or Fahrenheit, but more likely it will be Celsius. Here we can see it’s attribute number 190, and based on the name, it’s in Celsius.

[text]
root@novena:~# smartctl -A /dev/sda
smartctl 6.4 2014-10-07 r4002 [armv7l-linux-3.19.0-00268-g04e9d08] (local build)
Copyright (C) 2002-14, Bruce Allen, Christian Franke, www.smartmontools.org

=== START OF READ SMART DATA SECTION ===
SMART Attributes Data Structure revision number: 1
Vendor Specific SMART Attributes with Thresholds:
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      UPDATED  WHEN_FAILED RAW_VALUE
  5 Reallocated_Sector_Ct   0x0033   100   100   010    Pre-fail  Always       -       0
  9 Power_On_Hours          0x0032   099   099   000    Old_age   Always       -       169
 12 Power_Cycle_Count       0x0032   099   099   000    Old_age   Always       -       12
177 Wear_Leveling_Count     0x0013   100   100   000    Pre-fail  Always       -       0
179 Used_Rsvd_Blk_Cnt_Tot   0x0013   100   100   010    Pre-fail  Always       -       0
181 Program_Fail_Cnt_Total  0x0032   100   100   010    Old_age   Always       -       0
182 Erase_Fail_Count_Total  0x0032   100   100   010    Old_age   Always       -       0
183 Runtime_Bad_Block       0x0013   100   100   010    Pre-fail  Always       -       0
187 Uncorrectable_Error_Cnt 0x0032   100   100   000    Old_age   Always       -       0
190 Airflow_Temperature_Cel 0x0032   067   064   000    Old_age   Always       -       33
195 ECC_Error_Rate          0x001a   200   200   000    Old_age   Always       -       0
199 CRC_Error_Count         0x003e   100   100   000    Old_age   Always       -       0
235 POR_Recovery_Count      0x0012   099   099   000    Old_age   Always       -       8
241 Total_LBAs_Written      0x0032   099   099   000    Old_age   Always       -       39490266
[/text]

You can get the drive’s name with hddtemp --debug and also make an educated guess as to the sensor ID#. Notice the space in “250G B” below – you’ll want to use this exact model string in the db file. Not sure precisely why that space should be there, but from the hddtemp source it seems that’s how the drive identifies itself via SATA command. Perhaps other utils like smartctl get the name in other ways, or fix it up. I don’t have time to dig further on that…

[text]
root@novena:~# hddtemp --debug /dev/sda

================= hddtemp 0.3-beta15 ==================
Model: Samsung SSD 840 EVO 250G B �@

field(5) = 0
field(9) = 170
field(12) = 12
field(177) = 0
field(179) = 0
field(181) = 0
field(182) = 0
field(183) = 0
field(187) = 0
field(190) = 34
field(195) = 0
field(199) = 0
field(235) = 8
field(241) = 50

If one of the field value seems to match the temperature, be sure to read
the hddtemp man page before sending a report (section REPORT). Thanks.
[/text]
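
Once the new line is in /etc/hddtemp.db, a plain hddtemp run should resolve the sensor; assuming the drive is at /dev/sda like on my Novena, something like:

[bash]
hddtemp /dev/sda
# should now print something like:
#   /dev/sda: Samsung SSD 840 EVO 250G B: 34°C
[/bash]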

 

pg_dump and pg_restore are Slow? Try this.

A friend was trying to import about 300K rows that were dumped with pg_dump. Trouble was, it would barf saying “Invalid Command \N” and never work. The attempted work-around involved telling pg_dump to emit a discrete INSERT statement for each row (the --inserts flag), but this made it so slow it would have taken nearly 3 days, we guessed.

The actual error was above the flood of Invalid Command messages, but had scrolled away. We were able to see it like this:

[code]
[postgres@box]$ psql < import_me.sql 2>&1 | head -n 20
[/code]

Not sure if I needed the STDERR redirect there, but it seemed sensible and worked so I didn’t investigate further!

We ended up fixing the actual problem causing the Invalid Command messages, which involved pre-creating various bits of schema, relations, roles, etc., and using a data-only dump during The Big Import. The earlier error was knocking psql out of the COPY FROM STDIN command, so it tried to interpret the data rows that followed as SQL statements, which they are not. (That’s where the Invalid Command \N messages came from: \N is just the null placeholder in the dump’s data rows.)

Moral of the story: use COPY FROM, not INSERTs. There are plenty of other optimizations too: turn off indexes and add them back at the end, and make sure your pg tuning is in order.
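
For the record, here’s the shape of the workflow that worked for us, as a sketch; the database names are placeholders, and note that COPY is pg_dump’s default format, so you only pay the INSERT penalty if you ask for it with --inserts.

[bash]
# Schema first: tables, constraints, etc. (roles live in pg_dumpall --globals-only)
pg_dump --schema-only sourcedb > schema.sql
psql targetdb < schema.sql

# Then the bulk data as fast COPY statements, which is pg_dump's default
pg_dump --data-only sourcedb > data.sql
psql targetdb < data.sql 2>&1 | head -n 20   # any real error surfaces at the top
[/bash]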