Weekly Machine Learning drop #13

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. I publish these posts every Friday. They are divided into a few categories, and the format is constantly evolving.

News

In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about applications, predictions and controversies around ML.

Amazon’s move signals end of line for many cashiers
Amazon’s recent acquisition of Whole Foods is another step towards the company dominating retail. When you put that together with their Amazon Go experiment, you may start wondering what grocery stores will look like in the future.

Mars robot makes decisions on its own
NASA scientists installed 20,000 lines of code on the Mars rover Curiosity to give it some intelligence. Now it can recognize which rocks are worth a closer look, instead of beaming its laser at any rock it finds.

Artificial intelligence beat human player in Dota 2
Another game with a complicated set of rules and a nonlinear path to victory has been beaten by an algorithm. This time it was a bot from OpenAI, built using reinforcement learning.

 

DeepMind and Blizzard open Starcraft II as an AI research environment
On a similar topic – StarCraft II now also has a platform available for AI experiments. It contains, among other things, an API and a big (and constantly growing) set of recorded, anonymized gameplay replays. The release also contains an open-source version of DeepMind’s toolkit and access to mini-games that allow training agents for specific tasks.

How Machine Learning is transforming drug creation
Machine learning algorithms, which are good at pattern recognition, can go through new and existing genetic and medical information to find previously unknown connections, which will allow for creating more targeted medication.

TensorFlow 1.3 released
Last week, a new version of TensorFlow was released. Click through to see the list of features and improvements.

Learning materials

This week I wanted to share with you some repositories with TensorFlow best practices and the new Deep Learning course by Andrew Ng.

TensorFlow tutorials
Speaking of TensorFlow, this GitHub repository contains a bunch of tutorials that are simple, easy to follow and help you grasp the basics of the library.

TensorFlow best practices
And this repository contains a bunch of good practices for developing with TensorFlow.

New Deep Learning specialization from Andrew Ng
After his recent departure from Baidu and the founding of Deeplearning.ai, there is more news from Andrew Ng. He recently published a new Coursera specialization focused on neural networks and deep learning. Many people started their adventure in machine learning with his previous Coursera course, and this is definitely a great continuation. I’m in the third week of the first course and can recommend it to you with a clear conscience.

Andrew also recently raised a $150M venture capital fund to invest in AI.

That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Weekly Machine Learning drop #12

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. I publish these posts every Friday. They are divided into a few categories, and the format is constantly evolving. The last few weeks were a bit hectic due to hosting changes, but I’m back to regular posting.

News

In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Algorithms aren’t racist. Your skin is just too dark.
In this article, Joy Buolamwini shares her story of how facial recognition algorithms fail to recognise darker skin tones. You can also watch her TED talk on the subject. The issue doesn’t cause just minor problems, like cameras not finding somebody’s face. With the widespread use of facial recognition by law enforcement, it can get innocent people into trouble. It’s part of a bigger problem: algorithms can inherit human biases. She also issues a call to action – she wants to collect example cases of biased algorithms to find a way to fix the problem.

Is China outsmarting America in A.I.?
China is rapidly increasing its support for AI-related projects, while the US is decreasing government spending in that area. The article looks at how those changes may impact the future of the technology. The problem for China may be its traditional top-down management and the lack of a culture of open information exchange. But those things are changing too.

Software is eating the world, but AI will eat software
Nvidia CEO Jensen Huang shares his opinion on which industries will be impacted by AI developments. Apart from obvious examples like automotive or healthcare, he paradoxically mentions software itself.

Apple is working on a dedicated chip to run AI on devices
Yet another company is building custom-designed chips to accommodate the new processing needs of machine learning algorithms. But in contrast to Microsoft or Google, whose chips power their data centres, Apple is rumoured to be planning to put a dedicated AI chip in its devices.

The next big leap in AI could come from warehouse robots
Kindred is a company with a different approach to AI. In contrast to most tech companies, which focus on software and build chatbots or recommender systems, they believe that true AI innovation will come in the physical form of robots.

Learning materials

This week I have slightly less technical and a bit more visual content.

A visual introduction to machine learning
It’s a very nice visual presentation that shows the process of building machine learning models – starting with data analysis, through finding relevant features, up to constructing the model. The algorithm used here is a decision tree, a pretty basic ML method that can nevertheless be very effective on certain datasets. The website looked like it had great educational potential; unfortunately, this “part 1” has had no follow-ups.

A neural network playground
It’s another visualisation tool that shows the basic inner workings of neural networks. You get to choose from several input datasets and you have control over the construction of the network. You can set parameters like the number of hidden layers, the number of neurons or the activation functions, and see how they impact the results. It sheds a bit of light on the rather mysterious ways in which neural networks work.

That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Weekly ML drop #11

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving. Last week I didn’t have time to prepare a weekly drop, so some of today’s “news” may be a bit older.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Data is the new oil
In this very interesting article in The Economist, the author argues that in the 21st century, data will be (and already is) the main resource fuelling the economy, much as oil was in the 20th century.

Sent to Prison by a Software Program’s Secret Algorithm
A man charged with fleeing the police in a car was sentenced to 6 years in prison. One of the pieces of information used against him was a set of bar charts analysing the risks and threats he poses to society. It wasn’t the main evidence, but it is a little bit unsettling.

The parts of America most susceptible to automation
A short article discussing which areas of the US will be hit hardest by the advent of automation, and why. It also includes an interesting map.

Harnessing automation for a future that works
This article, on the other hand, claims we’re not quite there yet, and that we’ll need close cooperation between people and machines to make progress on automation.

6 areas of AI and ML to watch closely
If you’re interested in what’s really hot in the industry, this article lists the six most interesting technology areas.

Learning materials
Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Deep learning simplified
A series of short videos explaining basic concepts of machine learning. Recommended mostly for total beginners.

That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Phoenix, Ecto and time zones

This is another part of my series on learning Elixir.

In the previous episode, I calculated the distance of a flight. I will use it later for gathering statistics. Another metric I would like to have is the flight time. Usually, flight departure and arrival times are presented in local time, so it gets a bit tricky to calculate the duration when the journey spans a few time zones.

Elixir’s standard DateTime library doesn’t work with time zones very well. The Internet suggests I should use Timex.

But first I needed to make some changes to my database, because up until now I had only the flight date. A few weeks ago I wrote a post on how to update your schema with migrations, and this time I followed the same steps. Still, I haven’t figured out how to do it in a more automated manner.

defmodule Flightlog.Repo.Migrations.CreateFlight do
  use Ecto.Migration

  def change do
    alter table(:flights) do
      modify :arrival_date, Timex.Ecto.DateTimeWithTimezone
      modify :departure_date, Timex.Ecto.DateTimeWithTimezone
    end
    
  end
end
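For reference, a migration like this is generated and then applied with the standard Mix tasks (the migration name below is just illustrative):

mix ecto.gen.migration alter_flights_dates_with_timezone
mix ecto.migrate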

As you can see, I used dedicated Timex types. This works only with Postgres, and if you want to use time zones it needs one additional step. You’ll have to add a custom type to your database:

CREATE TYPE datetimetz AS (
    dt timestamptz,
    tz varchar
);

You can read more about using Timex with Ecto on this documentation page.

I also updated my flight.ex model. It looks like this right now:

defmodule Flightlog.Flight do
  use Flightlog.Web, :model

  schema "flights" do
    field :departure_date, Timex.Ecto.DateTimeWithTimezone
    field :arrival_date, Timex.Ecto.DateTimeWithTimezone
    field :flight_number, :string
    field :plane_type, :string
    field :from, :string
    field :to, :string

    timestamps()
  end

  @doc """
  Builds a changeset based on the `struct` and `params`.
  """
  def changeset(struct, params \\ %{}) do
    struct
    |> cast(params, [:departure_date, :arrival_date, :flight_number, :plane_type, :from, :to])
    |> validate_required([:departure_date, :arrival_date, :flight_number, :plane_type, :from, :to])
  end
end

After that, I followed the path that had already proven to work in the flight distance part. I added a new function to my math.ex library, making use of Timex’s diff function:

    def flightTime(earlier, later) do
        hours = Timex.diff(later, earlier, :hours)
        minutes = rem(Timex.diff(later, earlier, :minutes), 60)
        # pad the minutes so that e.g. 8 hours 5 minutes renders as "8:05", not "8:5"
        "#{hours}:#{String.pad_leading(Integer.to_string(minutes), 2, "0")}"
    end

And I’m calling it from the view in another function, so it’s easily accessible from the template:

  def time(time1, time2) do
    Flightlog.Math.flightTime(time1, time2)
  end
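As a quick sanity check in iex (the flight below is made up – departing Warsaw at 10:30 local time and arriving in New York at 13:05 local time, which works out to 8 hours and 35 minutes in the air):

iex> departure = Timex.to_datetime({{2017, 5, 7}, {10, 30, 0}}, "Europe/Warsaw")
iex> arrival = Timex.to_datetime({{2017, 5, 7}, {13, 5, 0}}, "America/New_York")
iex> Flightlog.Math.flightTime(departure, arrival)
"8:35"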

And that was it, although it took me a few hours because I was struggling a bit with the Timex types. I didn’t read the documentation carefully and, for example, missed the step of creating the new Postgres type. A good lesson to read the docs carefully :)

The effect:

 

Screen Shot 2017-05-07 at 23.26.43.png

As you can see, I made an attempt at formatting the dates. Unfortunately, I didn’t manage to show them in local time – they’re in UTC here. That will most likely be the next step.
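A sketch of what that next step could look like, using Timex’s timezone conversion and formatting (the timezone name and the format string are just examples, not the final code):

  def local_time(datetime, timezone \\ "Europe/Warsaw") do
    # convert the UTC value stored in the database to the zone we want to display
    datetime
    |> Timex.Timezone.convert(timezone)
    |> Timex.format!("{YYYY}-{0M}-{0D} {h24}:{0m}")
  end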

That’s all for today. Next week we’ll try to test this new module. In the meantime, check out the previous episodes. And if you’re interested in machine learning, look into my weekly link drop.

Weekly ML drop #10

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Neuralink and the Brain’s Magical Future
Tim Urban from Wait But Why wrote another lengthy article on one of Elon Musk’s projects. This time it’s about Neuralink, a sort of direct interface to the brain. Tim has also tackled other AI-related topics, like superintelligence, which is also a worthy read.

The Myth of Superhuman AI
Kevin Kelly, on the other hand, thinks that we’ll never get to superintelligence, for the five reasons he lists. Kevin was also mentioned in the edition from two weeks ago. I am currently reading his book, and if you’re interested in the future, it’s a must-read.

Software Predicts Cognitive Decline Using Brain Images
We’ve talked several times about the advancements machine learning brings to medicine. Image recognition algorithms are reaching or surpassing human-level performance at detecting various threats to our health in diagnostic imagery. This article covers the use of a neural network for early detection of Alzheimer’s disease.

The first wave of Corporate AI is doomed to fail
The author of the article compares the current wave of AI advances to the first booms of the internet and cloud computing, which failed miserably. Only after a period of retreat and further progress did those technologies hit a home run.

Waymo’s self-driving cars will take first riders
Alphabet’s self-driving car company is going to run tests in Phoenix that include people from outside Google. People can sign up, and the rides will be free. The goal is to check how people use and react to self-driving cars.

Learning materials
Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Videos from the ICLR conference have been published
The International Conference on Learning Representations, which took place in Toulon, France in late April, has just published its videos on Facebook.

Learning AI if you suck at math
This is a good article for total beginners who not only don’t have much experience in machine learning but also feel they’re lacking in math. It links to several good resources to brush up on your algebra and calculus.

That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Weekly ML drop #9

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

We need tools to track AI’s impact on the job market
An expert panel composed mainly of economists and computer scientists said in a new report that the world needs a way to measure how technology impacts the job market – much like when we started measuring our economies in the 1930s, which greatly improved governments’ awareness of the issues to address.

A series of articles on how AI is used in the biggest tech companies
Over the last year, Backchannel visited major tech companies – Apple, Google and Facebook – to interview them about how they use machine learning and AI.

Machine learning will be a great help in detecting cancer
This article from Google’s research blog shows how the assistance of machine learning algorithms can greatly improve cancer detection.

Will democracy survive Big Data and Artificial Intelligence?
This is a longer read from Scientific American that analyses the various impacts that Big Data and the growth of machine learning will have on future societies. It also tries to answer the question of what we should do now to secure our future.


Learning materials

Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Intel’s Deep Learning 102
A continuation of the webinar I linked last week. In this part, they give an overview of more advanced topics, like convolutional neural networks and recurrent neural networks.


That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Broadcasting custom packages on Estimote Location Beacons with Raspberry Pi

Last summer, together with Karl-Henrik Nilsson and Nejc Palir, I worked on a fun project that incorporated new features of Estimote Location Beacons. It was a solution for a public transport company that used beacons for two things – delivering proximity-based features and broadcasting custom messages to customers. The second part was possible thanks to the GPIO port on the (then very) new Location Beacons. We worked closely with Estimote to deliver a solution based on a not very well documented feature. It was built with a Raspberry Pi running a WiFi hotspot, powered by an external battery and running a Node.js configuration application. On top of that, we built a client application for iPhone and several other components.

For this post, I’ll show a simpler example to isolate one particular feature. I’ll focus on connecting a Raspberry Pi to a beacon and writing code that will change the beacon’s broadcasted “Local Name”. I’ll also use Node.js, as it allows for rapid changes without the need to recompile. This will be a good occasion to talk a little bit about UART, Bluetooth data frames and the simple npm package we built as a result of this project. It’s open-sourced on GitHub, so you can peek inside to see our lousy JavaScript and maybe offer some improvements? :)

You’ll need one Raspberry Pi with a clean install of Raspbian, one Location Beacon and a 4-wire ribbon cable. Make sure your RPi can connect to the Internet, and install Node.js and npm.

Connecting Raspberry Pi to Estimote Location beacon

There is a bunch of features that differentiate Location Beacons from the older Proximity Beacons: better battery life, a longer range, the ability to broadcast multiple packets at the same time, more sensors, internal memory and a few other things. They all greatly expand the usage scenarios for beacons. Most importantly for us, Location Beacons have a GPIO port. You can use it for sending signals to connected devices, for example turning on lamps when you’re in proximity. But you can also use the port in UART configuration and send data in both directions – for example, changing the BLE frame that is broadcasted by the beacon.

The GPIO port has four pins: GND, Vcc IN (5V power), GPIO0 and GPIO1.
Estimote pinout.png
Picture comes from the official Estimote app

rpi_pinout.png
Picture comes from Element14

When you set the beacon to work in UART configuration (more on that in the next section), GPIO0 will be UART_RX (receiving) and GPIO1 will be UART_TX (transmitting). To establish proper communication with the Raspberry Pi, you’ll need to connect Vcc IN to the Pi’s 5V power (pin 02), GND to GND (pin 06), GPIO0 (receiving) to GPIO14 (aka TXD0 – UART transmitting, pin 08) and GPIO1 (transmitting) to GPIO15 (aka RXD0 – UART receiving, pin 10). So: power to power, ground to ground, and transmitting to receiving (so one device can receive what the other transmits).

Connection scheme.png

Picture: Simplified schematic made using Fritzing. The beacon image comes from a package made by Devin Mancuso, available under a Creative Commons Attribution 3.0 license. I modified his work for use in Fritzing.


Configuring beacon

Estimote Location Beacons are not configured to use GPIO in UART mode by default. There is no option to do so in the official app, but you can use the Estimote iOS SDK to do it. Alternatively, Piotr Krawiec built a modified version of Estimote’s “Configuration” iOS example/app template that does exactly that. You need to compile it yourself, deploy it on your iPhone, then connect to the beacon and configure it. That should be all. Unfortunately, you won’t get any visual confirmation from the beacon that it worked.

Screenshots from UART configuration app

The Estimote Android SDK doesn’t support enabling UART mode yet, so you might want to borrow an iPhone from a colleague. Estimote told us that if somebody asks for this to be added to the Android SDK, they will gladly do it.

Configuring the Raspberry Pi to use UART instead of the console

We’ll also need some changes in the RPi configuration. By default, the UART pins are set up to work as an external console connection point, so we need to make two small changes:

  1. In /boot/config.txt, add the line
    enable_uart=1
  2. In /boot/cmdline.txt, change serial0 to serial1 and make sure the baud rate is set to 115200
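With the wiring and configuration in place, opening the serial port from Node.js can look like the sketch below. I’m assuming the serialport npm package here (in its pre-v10 constructor style); any library that can write raw bytes to /dev/serial0 at 115200 baud will do.

const SerialPort = require('serialport');

// /dev/serial0 points at the UART pins we just wired to the beacon
const port = new SerialPort('/dev/serial0', { baudRate: 115200 }, (err) => {
  if (err) {
    console.error('Could not open serial port:', err.message);
  }
});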

Writing to the beacon

UART (Universal Asynchronous Receiver/Transmitter) is hardware used for asynchronous serial communication. It has a very lightweight communication protocol and hence is quite often used in microcontrollers. Estimote beacons use UART with a custom protocol called EstiUART. You can find basic documentation for it in this package from Estimote. The way it works is that the whole Bluetooth frame you want to broadcast gets wrapped in another frame of Estimote bytes, and you send it over at a certain baud rate.

The beacon has 3 custom broadcasters, so you can broadcast 3 different messages in parallel. On top of that, you can use the standard iBeacon or Eddystone broadcasting packets. EstiUART allows setting the transmitting power, transmitting frequency and Bluetooth packet for each of them separately. To set a new broadcast message, you first need to enable one of the broadcasters. Then you send the whole BLE frame wrapped between the Estimote start and stop bytes.

There are 3 special bytes that you’ll have to take care of:

  • Estimote start byte. Every message starts with it: 0x73
  • Estimote stop byte. Every message ends with it: 0x65
  • Escape byte: 0x5C
    One example use is when you need to encode the letter ‘e’, which has an ASCII code of 101 (0x65 in hex), so that it won’t be confused with the stop byte.

Example frame

Let’s look at the example below. It sets the Local Name of the first of the three advertisers to the value “Test”:

0x73,0x13,0x42,0x10,0x00,0x00,0xBB,0xBB,
0xBB,0xBB,0xAE,0x02,0x01,0x04,0x06,0x09,
0x54,0x5C,0x65,0x5C,0x73,0x74,0x00,0x65

Byte-by-byte description:

0x73 – Estimote start byte.
0x13 – The register that will be written to. 0x13 is the register that sets the advertised data for the 1st broadcaster, which means the 1st broadcaster will now transmit the BLE packet starting from the next byte. Some other examples: 0x11 is used to set the advertising power (Tx) of the first broadcaster; 0x22 is used to set the advertising interval of the second broadcaster.
0x42 – 1st byte of the PDU header, describing the mode of communication. This is the beginning of the Bluetooth frame. For Estimote this will always be 0x42. You can read more in this great blog post.
0x10 – 2nd byte of the PDU header – the length of the packet, in this case 16. This includes the following zero byte, the MAC address, the following meta information and the message itself, but not the padding. For the message, you shouldn’t count the escape bytes, as they’re processed in the beacon and won’t be transmitted as part of the Bluetooth packet.
0x00 – Must be zero by design.
0x00 0xBB 0xBB 0xBB 0xBB 0xAE – MAC address. This is a so-called static MAC address. It can be generated randomly, but needs to meet a few requirements.
0x02 – From here on, the frame consists of GAP (Generic Access Profile) sections, each made of three parts. This byte is 1) the length of the following section – in this case, the next 2 bytes.
0x01 – 2) The type of the section. 0x01 means “flags”. It’s the only mandatory section. Read more here.
0x04 – 3) The section content. For Estimote this flag must be set to 0x04. Read more about flags here.
0x06 – The length of the next GAP section.
0x09 – The type of the section. 0x09 is Complete Local Name.
0x54 0x5C 0x65 0x5C 0x73 0x74 – The section content: the Local Name that will be broadcasted, “Test”. That is: ASCII for ‘T’, the escape byte (because ASCII for ‘e’ is the stop byte), ASCII for ‘e’, the escape byte (because ASCII for ‘s’ is the start byte), ASCII for ‘s’, ASCII for ‘t’.
0x00 – Padding. From my experience, just one byte is enough.
0x65 – Estimote stop byte.
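To make the escaping rule concrete, here is a rough Node.js sketch that assembles exactly the frame described above and writes it to the serial port opened earlier (the escapeBytes helper is just illustrative; beacon-pie, described below, does all of this for you):

const START = 0x73;  // Estimote start byte
const STOP = 0x65;   // Estimote stop byte
const ESCAPE = 0x5c; // Estimote escape byte

// Prefix any payload byte that collides with one of the special bytes with the escape byte
function escapeBytes(bytes) {
  const out = [];
  for (const b of bytes) {
    if (b === START || b === STOP || b === ESCAPE) {
      out.push(ESCAPE);
    }
    out.push(b);
  }
  return out;
}

const frame = [
  0x73, 0x13,                         // start byte, "advertised data" register of the 1st broadcaster
  0x42, 0x10, 0x00,                   // PDU header and the mandatory zero byte
  0x00, 0xbb, 0xbb, 0xbb, 0xbb, 0xae, // static MAC address
  0x02, 0x01, 0x04,                   // GAP section: flags
  0x06, 0x09,                         // GAP section header: length + Complete Local Name type
  ...escapeBytes([...Buffer.from('Test', 'ascii')]), // 0x54, 0x5C, 0x65, 0x5C, 0x73, 0x74
  0x00,                               // padding
  0x65,                               // stop byte
];

// port is the serialport instance from the earlier sketch
port.write(Buffer.from(frame), (err) => {
  if (err) console.error('Write failed:', err.message);
});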

Beacon-pie package

To avoid building the frame by hand every time, Karl-Henrik and I built an npm package that takes the message and does the work for us. It’s called beacon-pie. Currently, it only supports setting the Local Name, but potentially a lot of other Bluetooth properties, like offered services, could be set this way. We also added controls to turn the broadcasters on and off and to change their parameters (interval and Tx power).

Here is the simplest example of how to set the Local Name to “Test” with beacon-pie. You can either set all the properties in one function call or each of them separately.

How to confirm that it worked?

You don’t need to write your own app just to make sure it works. Eventually you’ll want one, because that’s why you’re changing what the beacon broadcasts in the first place. But for a quick check, there’s a simpler way: download one of the beacon scanning apps available in your platform’s app store. I use Nordic Semiconductor’s nRF Connect, which is available for both iOS and Android.

Screenshot from nRF Connect. You can see both our “device” transmitting the local name “Test” and one of the default Estimote packets.

Summary

So, to sum things up: it’s not well documented, but it’s totally possible to use the GPIO port of Location Beacons as a UART input and alter the messages the beacons broadcast. I used a Raspberry Pi, but it will work with anything that can send bytes. And you can construct the bytes manually, or use our npm package. Feel free to contribute, fork it or steal it.

Useful links

If you’re going to write iPhone apps interacting with Bluetooth Low Energy devices, I found these two blog posts very helpful:

  1. Interfacing TI SensorTag with iOS Devices using Swift
  2. A BLE Advertising Primer

 

Thanks for coming by. This was the most labour-intensive post on this blog so far. I also write about functional programming and machine learning – look around if you’re interested.

Doing math in Elixir – calculating Great Circle Distance

In the last two parts, I was setting up and consuming a quick API to get additional airport data. One piece of that information was the exact airport location on Earth. In this part, I’ll use it to calculate the travel distance and show how you can use the standard math library of the Erlang VM.

Calculating the distance of air travel based on location is not super hard, but it needs to account for the fact that the Earth is not flat. We’ll use something called the “great-circle distance”, as that’s how planes fly. On the most commonly used Mercator projection of the map, the shortest path between two points is not a straight line – it has a more parabola-like shape. It’s most visible on longer flights. Just go to flightradar24.com and check any plane flying over the Atlantic. They all seem to fly north first and then turn south as the flight progresses. In fact, they fly in a straight line, but because the Earth is a sphere (in a simplified model), meridians are not really parallel to each other.

After this geography primer, let’s get to the math. The formula for calculating the great-circle distance looks like this:

\Delta\sigma=\arccos\bigl(\sin\phi_1\cdot\sin\phi_2+\cos\phi_1\cdot\cos\phi_2\cdot\cos(\Delta\lambda)\bigr).

Where \phi_1,\lambda_1 and \phi_2,\lambda_2 are the latitudes and longitudes of the two points on Earth. The formula takes radians in and outputs radians. The result is the angular difference between the two points. To get the distance in kilometres, we have to multiply it by the radius of the Earth, which is around 6370 km.

d=r\,\Delta \sigma .

To the code! I changed my functions in the view so that instead of returning latitude and longitude separately, they give back a pair of coordinates as a tuple.

https://gist.github.com/mlusiak/0c934302b100a8b0f8899010fdd8422b

Then I pass the tuples to a newly created function distance, which calls the new Flightlog.Math module. I put the module in the /lib folder and it all worked on the first run!

https://gist.github.com/mlusiak/031d81a94c3302bb6be9068c8bc29345

:math is a reference to the Erlang VM’s math library. One thing that needs explanation is changing degrees into radians. The radian is a different mathematical representation of an angle: an angle of 1 radian means that the arc it describes has the same length as the radius of the circle. So a full circle is slightly over 6 radians (2 * Pi), as the formula for the circumference of a circle is 2 * Pi * radius.
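Since the gists above aren’t embedded in this page, here is roughly what such a module can look like – a sketch following the formula above, not necessarily the exact code from the gist:

defmodule Flightlog.Math do
  @earth_radius_km 6370

  # {latitude, longitude} tuples in degrees, as returned by the view functions
  def distance({lat1, lon1}, {lat2, lon2}) do
    phi1 = to_radians(lat1)
    phi2 = to_radians(lat2)
    delta_lambda = to_radians(lon2 - lon1)

    # the great-circle formula from above
    delta_sigma =
      :math.acos(
        :math.sin(phi1) * :math.sin(phi2) +
          :math.cos(phi1) * :math.cos(phi2) * :math.cos(delta_lambda)
      )

    @earth_radius_km * delta_sigma
  end

  defp to_radians(degrees), do: degrees * :math.pi() / 180
end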

That’s all for today. Next week we’ll try to test this new module. In the meantime, check out the previous episodes. And if you’re interested in machine learning, look into my weekly link drop.

Weekly ML drop #8

I’ve become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

German retailer Otto lets an algorithm order its supplies
To shorten delivery times, Otto allows its machine-learning-based system to automatically restock supplies, which has led to shorter deliveries, fewer returns and lower overall losses.

Nobody understands deep learning
There are certain fields where it is required that the algorithm used to produce results be explainable. Unfortunately, it’s close to impossible to answer the question “what exactly caused this network to give this answer”, given its complexity – hundreds of layers with thousands of neurons. It also makes debugging and finding errors very hard.

AI can acquire biases against race, gender, etc.
There’s an old saying about computer systems: garbage in, garbage out. It also applies to ML systems. The kind of data we feed to learning algorithms will impact how their models work. That’s why AI based on human-generated data may not be as democratic as some people promise.

Fast Drawing for everyone
This Google post talks about the AutoDraw experiment, which figures out what you wanted to draw and proposes a better representation of it. There are some limitations, though. It won’t offer you a cat in a shape similar to yours, just a bunch of predefined cats, and the number of recognised objects is quite limited. Still, an interesting toy to play with. And if you’re interested in the science behind it, there’s this article on the research blog.

Google’s neural networks duel against each other
Most of today’s machine learning is so-called supervised learning, which means that somebody needs to feed the algorithm with curated data (for example labelled images). In the adversarial approach described here, two networks train against each other instead. One overly simplified case: one network generates cat pictures, the other one recognises cats, and they both get better by feeding data to each other.

Video
I pick one or two videos every now and then that touch on an interesting subject in the AI and ML field. Sometimes it’s more scientific, other times it’s about real-life applications.

In the future, everything will be smart
In this short video, Kevin Kelly, author of “The Inevitable”, talks about how AI will become a commodity, as electricity did in the 19th century, and how we’re on the brink of another revolution.

Learning materials
Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Deep Learning 101 from Intel
This one-hour webinar goes through the basic concepts of deep learning and how these types of algorithms perform on Intel’s stack.


That’s it for today, thanks for reading. If you liked the post, let me know, and please check out the other parts of the series.

Setting up quick API with F# and Azure Functions

As mentioned in last week’s Elixir blog post, I produced a quick fake API based on Azure Functions. I thought it was going to take a couple of minutes, but it turned out to be a whole adventure in itself.

Creating a function is a breeze.

  1. Go to the Portal, click the big green “+” sign and search for “Function App”. (Screen Shot 2017-04-17 at 16.34.00.png)
  2. Pick the Function App published by Microsoft.
  3. Fill in all the necessary fields, like the app name (which must be globally unique) or the location. For the hosting plan I used the “Consumption plan”, which means I pay only for the time the function is running. I also like to pin my stuff to the dashboard, so it’s easier to find. (Screen Shot 2017-04-17 at 16.37.38.png)
  4. It will take several minutes to deploy.
  5. Now you can create the functions for the app. F# is hidden in small print just above the “Create this function” button, so click “create your own custom function”. (Screen Shot 2017-04-17 at 16.40.40.png)
  6. Then, in the Language drop-down, pick “F#” and for Scenario – “API & Webhooks”. There should be one F# function triggered by an HTTP request. That’s the one you want for an API.
  7. You’ll get a premade piece of code with a simple function that is triggered by an HTTP POST with a name object and responds with “Hello ” plus that name.

Then I started writing the logic I wanted. I made an array of hard-coded airport data. I made the function accept only GET requests (you can change that in the function.json file). In the code, I parse the query string and get the airport IATA code. If I have that airport in my array, I respond with 200 and JSON containing the data. Otherwise, I return 404. If there’s no parameter in the query string, the function answers with 500.

It’s relatively simple and straightforward F# code. I just struggled a lot with debugging. The small editor on Azure doesn’t give you static analysis, type information or squigglies. You need to run the function and check for compilation or runtime errors. There was also some weird scoping behaviour that forced me to declare the Airports array within the function. Anyway, after two hours I had an API that did what I wanted. You can see the code below. It’s not bulletproof, but it does the job. And I got to play with Azure Functions a bit.

https://gist.github.com/mlusiak/5053aabbc1c6e76db082dc8daa952c81
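If you don’t feel like clicking through, the gist has roughly this shape – a simplified sketch, not the exact code; the airport data, coordinates and the “iata” parameter name are made up for illustration:

open System.Net
open System.Net.Http
open Microsoft.Azure.WebJobs.Host

// illustrative response shape
type Airport = { Iata: string; Latitude: float; Longitude: float }

let Run (req: HttpRequestMessage, log: TraceWriter) =
    async {
        // hard-coded "database"; declared inside the function because of the
        // scoping behaviour mentioned above (coordinates are approximate)
        let airports =
            [ "KRK", (50.08, 19.78)
              "AMS", (52.31, 4.76) ]
            |> Map.ofList

        // look for the IATA code in the query string
        let iata =
            req.GetQueryNameValuePairs()
            |> Seq.tryFind (fun kv -> kv.Key.ToLowerInvariant() = "iata")
            |> Option.map (fun kv -> kv.Value.ToUpperInvariant())

        return
            match iata with
            | None ->
                // no parameter in the query string at all
                req.CreateResponse(HttpStatusCode.InternalServerError, "Missing 'iata' query parameter")
            | Some code ->
                match Map.tryFind code airports with
                | Some (lat, lon) ->
                    req.CreateResponse(HttpStatusCode.OK, { Iata = code; Latitude = lat; Longitude = lon })
                | None ->
                    req.CreateResponse(HttpStatusCode.NotFound, sprintf "Unknown airport: %s" code)
    } |> Async.StartAsTask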

If you want to read more about other types of F# Azure Functions, Mathias Brandewinder recently wrote two posts about timer- and queue-triggered functions.

That’s all for today. Tune in next week for another part. Also, check out the previous episodes. And if you’re interested in machine learning, look into my weekly link drop.