Random Species Name Generation using RNN

Last week I decided to do something fun with AI. I trained a small GRU-based Recurrent Neural Network (RNN) with TensorFlow to generate random species names following binomial nomenclature (e.g. Homo sapiens). The species names used for training were taken from UniProt's website: https://www.uniprot.org/docs/speclist

Click on the ‘Generate Species’ button to see it in action

For the code, visit:
https://www.kaggle.com/finalepoch/species-name-generation-using-tensorflow

Running your AI models the serverless way

As an ML practitioner, I am always on the lookout for cheap and easy ways to host my models. Google Cloud Functions can be a great option in such cases. In this article I will discuss some advantages of hosting your models for inference using Google Cloud Functions.

Why Serverless / Google Cloud Functions

  • Easy to deploy: all you need to do is write a function that performs predictions.
  • Easy and automatic scalability.
  • You get billed only when the function is called (great for hobby projects and demos!)
  • Multiple triggers. You can trigger your AI model in multiple ways (not just via a REST endpoint), which allows for creating complex AI-based pipelines. At the time of writing this article, the following triggers were available:
    • HTTP
    • Cloud Storage
    • Cloud Pub/Sub
    • Cloud Firestore
    • Firebase (Realtime Database, Storage, Analytics, Auth)
    • Stackdriver Logging
  • Detailed performance and usage reports. Most serverless offerings from major cloud providers come with great support for capturing and displaying metrics around your model's performance.
    Function dashboard in Google Cloud

Tips to host your function

Tip 1: Save money by choosing appropriate RAM

To save money, you will always want to run your functions in the lowest-memory environment possible. If you know the exact amount of memory your code requires, making a choice is simple. If you are not sure, start with a lower-memory container and determine the best fit by trial and error.

Tip 2: If possible bundle your model and dependencies with your code

If your model is small, you should consider bundling it with the function code, along with anything else that may be required for running the model, such as weights, vectorizers and dictionaries. Though the preferred place to keep this data is a Cloud Storage bucket, you can save some precious time by bundling such dependencies with your code, as sketched below.
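For illustration, here is a minimal sketch of loading a bundled model from the function's own source directory. The file name 'model.h5' and the Keras call are assumptions for this example, not part of the original project.

import os
from tensorflow.keras.models import load_model

# Resolve the path relative to the function's source directory, so it works
# regardless of the working directory the Cloud Functions runtime uses.
MODEL_PATH = os.path.join(os.path.dirname(__file__), "model.h5")
model = load_model(MODEL_PATH)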

Tip 3: You can benefit from cold start and using global variables

A serverless function is essentially stateless, which means you end up repeating every step required to make predictions each time the function is invoked. Luckily, there is a 'state' aspect to cloud functions, as mentioned here. Depending on the load, multiple calls to a function can be directed to the same instance of the function. In such cases, any global variables that have already been loaded are preserved and their values are available inside the function. These global variables can be used to hold models and the results of any computationally expensive task.
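Here is a minimal sketch of this pattern for an HTTP-triggered function. The function name, model file and request fields are hypothetical; the point is only that the expensive load happens once per warm instance.

import tensorflow as tf

# Global variable: populated on cold start and reused while the instance stays warm.
model = None

def predict(request):
    """HTTP-triggered Cloud Function (illustrative sketch)."""
    global model
    if model is None:
        # Expensive work happens only once per instance, not on every request.
        model = tf.keras.models.load_model("model.h5")
    data = request.get_json()
    prediction = model.predict([data["input"]])
    return {"prediction": prediction.tolist()}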

Finding and analyzing popular TV shows on Netflix using topic modelling in Python

This video contains the results of a data science experiment I performed on tweets containing the word 'Netflix'.

I used the Twitter Streaming API to collect around 272,300 tweets over a duration of 24 hours starting July 14, 2020. I wrote a small piece of code using the tweepy Python library and stored the tweets in a SQLite database, roughly along the lines of the sketch below.
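For reference, here is a minimal sketch of such a collector using the tweepy 3.x streaming API and SQLite. The credentials and table layout are placeholders, not the exact code from the experiment.

import sqlite3
import tweepy

# Placeholder credentials; the listener below follows the tweepy 3.x API.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_SECRET")

conn = sqlite3.connect("tweets.db")
conn.execute("CREATE TABLE IF NOT EXISTS tweets (id TEXT, text TEXT, created_at TEXT)")

class NetflixListener(tweepy.StreamListener):
    def on_status(self, status):
        # Store each incoming tweet in the SQLite database.
        conn.execute("INSERT INTO tweets VALUES (?, ?, ?)",
                     (status.id_str, status.text, str(status.created_at)))
        conn.commit()

stream = tweepy.Stream(auth=auth, listener=NetflixListener())
stream.filter(track=["Netflix"])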

I fitted a Latent Dirichlet Allocation model to extract 25 topics from a random subset of tweets. I manually reviewed the important keywords associated with each topic and shortlisted the topics related to TV shows. Then I used the fitted LDA model to tag every tweet in the dataset with a topic name; a minimal sketch of this step is shown below.
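Here is a minimal sketch of fitting such an LDA model with scikit-learn. The variable tweet_texts and the vectorizer settings are assumptions for illustration, not the exact code used in the experiment.

from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# 'tweet_texts' is a hypothetical list of tweet strings loaded from the SQLite database.
vectorizer = CountVectorizer(stop_words="english", max_features=5000)
X = vectorizer.fit_transform(tweet_texts)

# Fit an LDA model with 25 topics, as described above.
lda = LatentDirichletAllocation(n_components=25, random_state=42)
lda.fit(X)

# Tag each tweet with its most probable topic.
topic_ids = lda.transform(X).argmax(axis=1)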

The following Python libraries/tools were used for the above project:

  • geopy
  • tweepy
  • scikit-learn
  • numpy
  • pandas
  • SQLite
  • Plotly
  • GCP’s Geocoding API
  • Twitter’s streaming API

Motion detection in microscope images using Python and OpenCV

There are three types of organisms one can see in the above video: the round one, the long one, and a couple of amoebae.

Last year, I was introduced to a wonderful scientific instrument called the 'Foldscope'. I have spent hours observing things with it. One of my favorite pastimes is to observe ciliates using the Foldscope. Ciliates are very simple single-cell organisms which are easy to find and come in numerous shapes and sizes. Most ciliates move very fast, and you need some skill with a microscope to follow them on the slide. This inspired me to write some code that could detect moving objects in a video and draw rectangles around them. Amazingly, I believe I was able to do a decent job with under 60 lines of Python code
(https://github.com/code2k13/motiondetection )

In this post I will discuss the concepts I used for detecting moving objects and how they work together to produce the end result.

Reading video with Python and OpenCV

The first thing we need to do is load frames one by one from a video. OpenCV makes this task very easy. It has a very convenient function called 'cv2.VideoCapture' which returns an object that can be used to find out information about the video (like width, height and frame rate). The same object allows us to read a single frame from the video by calling its 'read()' method. The 'read()' method returns two values: a boolean indicating the success of the operation, and the frame as an image.

import cv2

cap = cv2.VideoCapture("video_input.mp4")
if cap.isOpened():
    width = cap.get(cv2.CAP_PROP_FRAME_WIDTH)    # float
    height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)  # float
    fps = cap.get(cv2.CAP_PROP_FPS)

The full video can be read frame by frame using the following code:

success = True
while success:
    success, im = cap.read()
    if not success:
        break
    # process the frame 'im' here

Writing videos using OpenCV

Writing videos with OpenCV is also very easy. Similar to the 'VideoCapture' function, the 'VideoWriter' function can be used to write a video frame by frame. This function expects the path of the output video, the codec information (fourcc), frames per second, and the width and height of the output frames as parameters.

fourcc = cv2.VideoWriter_fourcc('m', 'p', '4', 'v')
out = cv2.VideoWriter('video_output.mp4', fourcc, int(fps), (int(width), int(height)))

Writing a frame to the video is as easy as calling:

out.write(image_obj)

Finding frame difference

The above video was generated out of frame diffs from the original video

Images are represented as matrices in memory. OpenCV has a function called 'cv2.absdiff()' which can be used to calculate the absolute difference of two images. This is the basis of our motion detection. We are relying on the fact that when something in the video moves, the absdiff will be non-zero for those pixels. However, if something is stationary and has not moved in two consecutive frames, the absdiff will be zero. So, as we read the video frame by frame, we compare the current frame with the previous frame and calculate the absdiff matrix. The dimensions of this matrix are the same as those of the images being compared.
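For illustration, here is a minimal sketch of this differencing step. The frame variables are hypothetical and the actual repository code may differ.

# Convert consecutive frames to grayscale and take their absolute difference.
prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
frame_diff = cv2.absdiff(curr_gray, prev_gray)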

Sounds easy, right? But there are some problems with this approach. Firstly, cameras and software produce artifacts when they capture and encode videos. Such artifacts give us a non-zero diff even when the object is stationary. Uneven lighting and focusing can also cause non-zero diffs for stationary portions of images.

After experimenting with some approaches, I found that thresholding the diff image using the mean value works very well.

import numpy as np

# Keep only differences above the mean of the non-zero pixels;
# everything below is treated as noise and set to zero.
nonzero_mean = frame_diff[np.nonzero(frame_diff)].mean()
frame_diff[frame_diff < nonzero_mean] = 0

Using edge detection to improve accuracy

Edge detection using ‘Sobel’ filter performed on the frame diff video

As microorganisms move, they push matter around them, which gives positive pixels after diffing. But we want to differentiate microorganisms from other things. Focusing also plays an important part here: a lot of out-of-focus moving objects will also give positive frame differences. Mostly these are blurred objects which we simply want to ignore. This is where edge detection comes into play, since in-focus objects produce sharp edges and borders in the image. Finding those edges can easily be achieved using the 'sobel' filter from the scikit-image package.

from skimage import filters

output = filters.sobel(frame_diff)

Using contour detection to detect objects

Video generated after performing contour detection on the Sobel filter output

Most protozoans, like ciliates, do not always show a clear border (because they are mostly transparent). So when we use edge detection to detect the shapes/outlines of moving objects, we get broken edges. In my experience, contour detection works very well to group such broken edges and generate a more continuous border.

contours, hierarchy = cv2.findContours(thresh, cv2.RETR_TREE, cv2.CHAIN_APPROX_SIMPLE)

OpenCV has built-in functions to find contours. The best part is that the function can find nested contour structures and return a hierarchy. The 'cv2.findContours' function returns this hierarchy of contours. We only consider top-level contours (those which don't have a parent). For a contour 'idx', if 'hierarchy[0][idx][3]' is -1, it means that it is a top-level contour without a parent. Everything else we ignore.

Creating bounding boxes

Video showing bounding boxes drawn over contours

Creating boxes around contours can require a bit of math. Luckily, OpenCV has a convenient function 'cv2.boundingRect' which returns the top-left coordinates, width and height of the bounding rectangle around a given contour. Once we have that, drawing a rectangle on our frame can simply be done using the cv2.rectangle function, which also accepts the color and border width of the rectangle.

for idx, contour in enumerate(contours):
    if hierarchy[0][idx][3] == -1:           # top-level contour (no parent)
        x, y, w, h = cv2.boundingRect(contour)
        if w * h <= q ** 2:                  # ignore objects smaller than q x q pixels
            continue
        image = cv2.rectangle(image, (x, y), (x + w, y + h), color, 1)

The concept of ‘q’

As I explained earlier, videos taken using a microscope can be messy; there can be a lot going on. We may only be interested in detecting objects of a certain size. This is where I have introduced a parameter called 'q'. This parameter was used for altering the settings of the various filters I experimented with. Currently it is only used to filter out bounding rectangles which are smaller than q^2 in area. You should experiment with different values of 'q', depending on the resolution of your video and the size of the objects you are interested in.

Things to improve

I want to make this approach fast enough to run in real time. It would also be nice to port it to Android or a mobile phone. I also plan to experiment with ML-based segmentation techniques for better detection.

The full code

The full code, along with a sample video, is available on GitHub:
https://github.com/code2k13/motiondetection

Stomata seen through a microscope

Like everyone else, I studied 'stomata' when I was in school, but I never got to see them until I purchased a Foldscope. The Foldscope is an amazing instrument and can be a great teaching aid in classrooms. Capturing stomata images can be very tricky: you need to take an epidermal peel from the lower side of a leaf, which isn't easy and depends upon the plant you choose for observation.

Here are some images of leaves and stomata that I was able to capture using the Foldscope.

Stomata From Onion Leaf

Stomata from Fenugreek Leaf

Zoom into the images and you should see stomata at the border of irregular cell walls

Stomata Using Dark Field

Stomata can be seen very clearly using dark field observation techniques. Using some digital zoom on my camera I was able to resolve more details; you can almost make out the guard cells.

Pollen Grains seen through a microscope

Pollen grains are some of the easiest and most interesting objects for observing under a microscope. I have observed many different types of pollen grains using my Foldscope. Here is a small summary of my findings:

Pumpkin Pollen

Pumpkin pollen grains are somewhat bigger in size. Using a Foldscope you can see the spikes on the pollen grains very clearly.

Gerbera Flower Pollen

Gerbera flower pollen has a very interesting shape.

Lotus Flower Pollen

The pollen grains of a Lotus flower look absolutely beautiful.

Sweet William Flower

Pollen Grains of Unknown Flower

These have a very interesting shape. You can clearly see a pattern on the surface of each grain!

Marigold Flower Pollen

Pollen grains of another common flower

Spider Lily Pollen

Beautiful images of pollen grains from a Spider Lily flower. Plastic adhesive tape was used to create the Foldscope slide used to observe the pollen grains. It looks like some chemicals in the adhesive caused coloured liquid to ooze out from the flower.

Bougainvillea Pollen Grains

Disk-like pollen grains with a slight depression on one side can be clearly seen if you zoom into the image.

Student's Manual of Job Interviews

📘Read the ebook from Amazon.in
📘Read the ebook from Amazon.com

Last week I published a small book on Amazon titled :

Student’s Manual of Job Interviews: Advice on cracking job interviews for technology students

Frankly speaking, there are many books on resume and job interview preparation. This book, however, is specifically written for technology students, engineers and graduates. For a technology student, resume preparation and handling interviews can be tricky. Also, when you are a 'fresher', you have little or no work experience. Hence, generic advice on resume creation and handling job interviews might be insufficient.

This book was born out of a sincere desire to help technology students get their first job. It is basically a guide that contains the dos and don'ts of preparing for a job interview. The contents of the book come from my 17 years of work experience as a developer/architect in the software industry, which includes interviewing students and working with interns.

This book will teach you how to:

  • Prepare a great resume.
  • Stand out as unique (within your college and the job market) !
  • Plan and develop a skill set which suits your interests.
  • Use your education, project work and extracurricular activities to your advantage during an interview.
  • Avoid common mistakes during technical and HR interviews.
  • Give up on certain misconceptions which most ‘freshers’ have.
  • Handle rejection and retry !

I hope you find this book useful !

Creating Anti-Squish Slides for observing insects using Foldscope

Introduction

Ever since I got my hands on a Foldscope, I have been pretty much obsessed with it. I have used it to see and discover many interesting things. One fun activity, which I am sure every Foldscope user must have tried, is observing insects.

If you have ever tried observing insects, you might have realized that they tend to get 'squished' due to the magnets and focusing mechanism of the Foldscope. I wanted to find a way to observe live, moving insects using the Foldscope. The write-up below discusses in detail a method of creating an 'anti-squish' slide for the Foldscope and observing insects using it.

Items needed

  • Transparent nail polish (or some sort of transparent super glue).
  • Glass slide
  • Cover slip
  • Plastic thread (similar to the kind used for attaching price tags to apparel in malls and supermarkets).
  • Bits of white paper
  • Some patience !!

items required to create the anti-squish slide

Creating the ‘anti-squish’ slide

Air-gap created after gluing the thread and slide
One side is kept open so that specimen can be inserted

The idea is very simple: we want to glue the thread between the cover slip and the slide, along the borders of the cover slip, so that a cavity or air gap is created between the slide and the cover slip.

For my experiment, I used a rectangular coverslip because it is easier to bend the thread along its edges and glue it.

Only glue three sides of the coverslip to the slide. We will leave one side open, as that will allow us to push small insects into the small 'glass box' we have created.

I used transparent nail-polish as a glue, but using a commercial transparent super-glue should work much better in my opinion.

It must be noted that the air gap needs to be very small, which means the plastic thread used to create the gap should be quite thin. This results in a limitation: only very small insects (smaller than ants) can be viewed. You may be tempted to use a much thicker thread to create more headroom, but I should caution you that you may struggle with focusing. However, if you are able to design something that allows larger insects like ants to be observed, please let me know!

Observing insects using this slide

Using paper bits for scattering light
Ok, this is the fun part. This is what I typically do:

  • Use a pointed object like needle or toothpick to transfer the insect or larvae on the glass slide first (near the open end).

  • Now use a long strip of paper and gently push the specimen into the cavity we created. Be gentle and try not to harm the insect in the process.

  • Push the specimen all the way in, towards the sealed walls. Since we have left one side of our glass box open, the cover slip tends to bend near the open side and 'squish' the specimen when we focus using the Foldscope's focusing mechanism.

  • I have also tried to insert very tiny bits of papers inside the slide surrounding the specimen. It serves two purposes:

    • It limits the movement of the insect. It also provides the support the insect needs to re-orient itself and possibly expose different organs as it tries to move through the bits of paper.

    • It allows for scattering of light. This is very important for us to be able to see all sides of the insect.

  • Use a strong LED light source for illuminating the specimen.

Doing more

Though the primary intention of this technique was to observe insects, there are many cool things you can try to observe using this technique, like :

  • You can culture things like fungi and moss inside the slide and monitor their growth.

  • You can try to observe various stages of an insect’s lifecycle (eggs –> larvae –> insect).

  • Observe how insects feed. How they interact with each other.

  • Germinating seeds can be observed. It would be fun to see cotyledons, roots and shoots!

Finally

Hope you found my experiment interesting. If you decide to make this type of slide, please do share your findings with me and the Foldscope community.

Diwali decoration using Arduino and WS2812B LED strip

For a long time I have wanted to create something cool using LEDs and an Arduino for Diwali. For those who don't know, 'Diwali' is known as the 'festival of lights' in India. During Diwali you will see houses and buildings decorated with earthen lamps (called 'diyas'), lanterns and LED lights. Shown below is the programmable, color-changing lantern decoration which I created using an Arduino.

The lanterns display in daylight(left) and at night (right)

What I used

  • Arduino Uno (or clone)
  • LED strip (WS2812B, 5V) x 1 meter
  • Small Diwali lantern x 4 [the translucent ones, which can be fitted with a small bulb]
  • Plastic string or wire to tie the lanterns one below another.

Construction

Construction is simple, but it is very important to select the right lanterns. In India, small golden or silver colored lanterns are easily available and cheap, but these are not translucent (they are intended for daytime decoration). If you search the shops, you will find translucent lanterns of a similar size (basically the smallest replicas of common lanterns). Given below is a photo of the correct type of lantern.

I used small sized translucent lanterns.

Closeup of the LED strip.

I tied all four lanterns one below the other. There are many ways to do this; I used thin plastic rope to tie the lanterns together. These lanterns have an inner plastic ring with spokes at the top, which can be used to tie the string/rope. I ensured that the rope was not easily visible and mostly hidden when suspended. The height of my lantern chain, when suspended, had to be roughly one meter. In my case, I decided to use only 28 of the 30 LEDs on the strip. Once the lanterns were tied and the chain was created, the next step was to fix the LED strip inside the lanterns. I simply tied one end of the strip to the inner ring of the topmost lantern and suspended the remaining portion of the strip into the 'hollow' of the lantern chain. Interfacing the LED strip was very simple. The strip has three wires, which I connected as follows:

LED strip pin    Arduino pin
Data             PD3
VCC              5V
GND              GND

Important

A lot of articles on the Internet advocate using a separate power supply for the LED strip. I managed to get it to work by connecting it directly to my board (I was using an Arduino clone), perhaps because I only had a 1-meter-long strip. For anything longer, please consider adding a separate power supply.

Once I was done with my hardware, it was time to connect the strip to my Arduino board and write some code.

The Software

To work with the LED strip you need the FastLED library for Arduino. The library comes with cool examples and decent documentation. Initially I wrote a few programs for the LED strip using basic RGB color manipulation (playing with random RGB values), but I soon realized that this does not produce a nice visual effect. The best way to produce visual effects with distinguishable colors was to use FastLED HSV colors:

#include "FastLED.h"

FASTLED_USING_NAMESPACE

#if FASTLED_VERSION < 3001000
#error "Requires FastLED 3.1 or later; check github for latest code."
#endif

#define DATA_PIN 3
#define LED_TYPE WS2811
#define COLOR_ORDER GRB
#define NUM_LEDS 28
#define BRIGHTNESS 96
#define MIN_VAL 50
#define MAX_VAL 255

CRGB leds[NUM_LEDS];

void setup() {
delay(3000);
FastLED.addLeds<LED_TYPE, DATA_PIN, COLOR_ORDER>(leds, NUM_LEDS).setCorrection(TypicalLEDStrip);
FastLED.setBrightness(BRIGHTNESS);
}

void loop()
{
byte old_h = 0;
byte h = 0;
while (true) {

while (old_h == h)
{
h = random(0, 8);
}

for (byte i = 0; i < 4; i++) {
set_latern_color(i, h * 32);
FastLED.show();
delay(2000);
}

old_h = h;
}
}

void set_latern_color(byte lantern_no, byte hue)
{
for (byte i = 0; i < 7; i = i + 1)
{
leds[lantern_no * 7 + i].setHue(hue);
}
}

Finally !

Hope you enjoyed reading this article. If you create something similar, please feel free to share it with me!

Creating a temperature & humidity tweet bot using IFTTT and ESP8266

Tweets from our tweet bot

In my previous articles, I wrote in detail about creating a WiFi-enabled temperature and humidity logger using an ESP8266 + NodeMCU, and about configuring a NoSQL (CouchDB) database to store the data it transmits. In this article, we are going to make our IoT device tweet temperature and humidity readings using IFTTT. Once we are able to connect our device to IFTTT, it opens up a plethora of interesting things we can do with it; being able to tweet temperature and humidity is just one of them.

What is IFTTT?

Block diagram illustrating role of IFTTT

IFTTT is a service which allows us to define actions for specific events. An example could be: 'When I receive an email, send me an SMS'. IFTTT is made up of 'Channels' and 'Recipes'. Each 'Channel' supports certain actions. If you ask me, 'Channels' can be classified into two main categories: the first type polls or listens to a feed/stream and provides triggers, while the other type allows you to do something, like sending an SMS, writing an email or updating a document on Google Drive.

You create a Recipe using two Channels, one typically to listen to something (example: incoming sensor data) and the other to do something (example: post sensor data to Twitter). All this without writing a single line of code. A lot of people are of the opinion that IFTTT is the next big thing that is happening to the Internet.

Getting started with IFTTT

First, create an account on IFTTT. I strongly recommend familiarizing yourself with IFTTT by creating a few simple Recipes. Once you do that, head straight to the Maker Channel, which has been specially created to support DIY electronics. Once you 'connect' to this Channel, you should be able to see your 'key' on the Channel's main page. This is a secret key that your IoT device should use to send data over HTTPS to IFTTT. Every time you send data using the key, the 'Maker' Channel is triggered. The idea is simple: we create a Recipe involving the 'Maker' and 'Twitter' Channels (if data is received by Maker, then post it to Twitter).

Sending data to the 'Maker' Channel is simple. You need to send an HTTP POST request to the URL below:

https://maker.ifttt.com/trigger/{event}/with/key/your_secret_key

with data inside the body of the request in the JSON format shown below:

{ "value1" : "sensor_reading1", "value2" : "sensor_reading2", "value3" : "" }

One important thing to remember is to replace {event} with a friendly name that corresponds to the sensor or process sending the data. For example, if we had two temperature sensors, we could create two events called 'sensor1_data_rcvd' and 'sensor2_data_rcvd'. When you create a Recipe involving the 'Maker' Channel, you need to enter the name of the event as a parameter.
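Before wiring up the device, you can test the trigger from a desktop. Here is a minimal sketch using Python's requests library; the event name 'tempupdt', the placeholder key and the sample values are assumptions for illustration only.

import requests

# Hypothetical test: trigger the Maker event before using the actual device.
IFTTT_URL = "https://maker.ifttt.com/trigger/tempupdt/with/key/your_secret_key"
payload = {"value1": "28", "value2": "65", "value3": ""}

response = requests.post(IFTTT_URL, json=payload)
print(response.status_code, response.text)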

Creating an IFTTT Recipe for tweeting Temperature & Humidity

Now that we are familiar with IFTTT and the 'Maker' Channel, let us create a Recipe which will post temperature and humidity to Twitter. For this you need to 'connect' to the 'Maker' and 'Twitter' Channels (you need to have a Twitter account). Next, follow the steps below to create a new Recipe.

  • Step 1: Choose Channel Trigger. Select the 'Maker' Channel.

  • Step 2: Choose a Trigger. Simply click on the 'Receive a Web Request' link.

  • Step 3: Complete Trigger Fields. Enter 'tempupdt' as the event name; our device will publish under this event name.

  • Step 4: Choose an Action Channel. Select 'Twitter'.

  • Step 5: Choose an Action. Click on the 'Post a tweet' link.

  • Step 6: In the 'Tweet Text' textbox, enter the text as shown below.

    Adding tweet text in Step 6

  • Step 7: Click on 'Create Action' and we are done.

Now we have successfully created a Recipe which will post temperature and humidity readings to Twitter.

Programing the IoT device (temperature and humidity logger)

The IoT device is a temperature and humidity logger (an ESP8266 running NodeMCU with a DHT11 sensor) which I described in my previous article, so I will not discuss the details of its construction and the setup required to program it. In this section we will discuss the code in detail. One important thing to note is that we need to make an HTTPS request to send data to IFTTT. For this we need the latest version of the NodeMCU firmware: NodeMCU v1.5.1. Older versions of the firmware could be downloaded directly from their site and burned onto the ESP8266; however, starting with v1.5.1, pre-built binaries are not provided.


Source: https://github.com/nodemcu/nodemcu-firmware

This means either you have to build it from source (the last thing I want to do) or use their ‘custom-build’ service.

Luckily, their custom-build service is super cool and allows us to create a custom build with only the components we want. When creating a custom build, please ensure that the HTTP, Crypto and DHT modules are checked, in addition to the default selections.

Make sure HTTP, Crypto and DHT are selected in your custom build.

Once you download a custom build of NodeMCU v1.5.1, please follow the instructions mentioned here to burn it onto the ESP8266. After that is done, you will need to connect the DHT11 sensor and upload the following script to the ESP8266:

wifi_nwid = "wifi_network_id"
wifi_pwd = "wifi_password"
interval = 30000
---------------------------------
iscon = false
temp_old = 0.0
humi_old = 0.0

function checkConnection()
  if wifi.sta.getip() ~= nil then
    iscon = true
  else
    iscon = false
    print('> Wifi error. Retrying')
    wifi.sta.connect()
  end
end

function get_payload()
  status, temp, humi = dht.read11(4)

  if (status == dht.OK) then
    -- Skip posting if the readings have not changed significantly.
    if (temp_old == temp) and (math.abs(humi_old - humi) < 5) then
      return nil
    end
    temp_old = temp
    humi_old = humi
    return string.format("{\"value1\": \"%d\",\"value2\":\"%d\"}",
      math.floor(temp), math.floor(humi))
  end

  return nil
end

function post_values()
  pl = get_payload()
  if pl then
    -- The event name in the URL must match the one used in the Recipe ('tempupdt').
    http.post('https://maker.ifttt.com/trigger/tempupdt/with/key/{your_secret_key}',
      'Content-Type: application/json\r\n',
      pl,
      function(code, data)
        if (code < 0) then
          print("HTTP ERROR")
        else
          print("SENT.")
        end
      end)
  end
end

function run()
  checkConnection()
  if iscon == true then
    post_values()
  end
  tmr.alarm(0, interval, 0, function()
    run()
  end)
end

print('> Booting..')
wifi.setmode(wifi.STATION)
wifi.sta.config(wifi_nwid, wifi_pwd)
wifi.sta.connect()

In the above code, please remember to change the values of the wifi_nwid and wifi_pwd variables before uploading. Also, please substitute {your_secret_key} with the actual secret key of your 'Maker' Channel, as described in the earlier section.

Things to consider when posting to Twitter

When posting to Twitter, we need to be aware of the daily limits on tweets. In practice, our IoT device can generate a lot of tweets. When I first programmed the device, it would give a reading every 30 seconds. Then I thought of changing the code so that it posted the sensor data every five minutes. However, a much better way was to make the device tweet only when there is a change of 1 degree Celsius or more in temperature, or a fluctuation of 10% or more in the humidity readings. This way we tweet more meaningful data, and with this approach I believe our tweets should stay under the daily limit.

Doing more with IFTTT

Tweeting temperature and humidity is just one of the many cool things we can do with IFTTT. We can also create a Recipe that will write temperature and humidity readings to a spreadsheet on Google Drive. Shown below is output from one such Recipe.

Top: Row format expression used in the IFTTT Recipe to write temperature & humidity values to a spreadsheet. Bottom: Output of the IFTTT Recipe; a spreadsheet containing the data.

Another interesting use would be to control an air conditioner based on the temperature reading.

One important thing to note here is how easy it was to make our device post data to IFTTT's 'Maker' Channel. All we had to do was write some code that made an HTTP POST request. Once we did that, our device was ready to talk to scores of different services without us having to configure anything at our end. This is the real power of IFTTT.

A word on IFTTT and IoT

I want to end this article with my thoughts on IFTTT. I think it is a great idea, certainly one of the next big things. In today's world, building a platform from scratch is not an option. IFTTT is awesome because it serves as a glue between many established software services and platforms. There are many consumer-grade IoT products/appliances (WiFi-enabled plugs, smart lights, surveillance cameras...) which come with out-of-the-box integration with IFTTT. In the future (and even now), I believe such compatibility will be a key selling point for such items. If you are thinking of buying a smart appliance, make sure that it has an IFTTT Channel; if it doesn't, then you should certainly be reading this article.