What I read for Machine Learning news

Earlier this year, I was publishing weekly (or sometimes semi-weekly) drops with links to interesting news, videos and learning materials for Machine Learning. I wasn't very consistent about it and at some point just stopped.

Here are the sources I get my machine learning reading from:

Blogs and aggregators

Machine Learning reddit

DataTau (HackerNews-like data science-oriented service)

Towards Data Science

Newsletters on Machine Learning

Machine Learnings (from the creators of MachineLearnings.co)

Intuition Machine

The Wild Week in AI

AI Weekly

Data Elixir

 

Have a great read!

Building an IA-32 assembly project in Visual Studio 2017

If you saw my presentation at DevConf, you may have gotten curious and want to play with assembly language yourself. Setting up Visual Studio for compiling and debugging Intel assembly is not hard, but it's not entirely straightforward either. Hence this short tutorial. I'll show you a simple setup with a C file as the entry point and a small assembly "library".

Project setup

1. Create a new project – C++ Win32 Console Application.

 

2. On the first screen click Next, as we want to pick some custom settings. On the second one, mark "Console Application" and "Empty Project". For this simple example, we won't need any of the default Windows headers. Click "Finish".

 

3. Right-click the project, "Add" -> "New Item…". Pick a C++ file, but give it a .c extension, for example "source.c".

 

4. Add another file in a similar way and name it "lib.asm". Which names you pick doesn't really matter, as long as the C and ASM files have different ones. Also add a header file, for example "header.h". Your project should now have a structure similar to this:

 

5. Right-click the project, "Build Dependencies" -> "Build Customizations…". Tick "masm". This adds the Microsoft Macro Assembler to the build.

 

6. Right-click the assembly file (lib.asm) and open "Properties". The Item Type most likely says "Does not participate in build". Change it to "Microsoft Macro Assembler".

 

Code

Now let's add the simplest possible code to make it run. First, the assembly file. It defines a simple multiplication function that takes two arguments.

;;lib.asm

.386
PUBLIC _multiply

_CODE SEGMENT dword public 'CODE' use32
ASSUME CS:_CODE

_multiply PROC near

	push    ebp
	mov     ebp, esp   
	push	ebx
	
	mov	eax, [ebp+12]			;; second argument
	mov	ebx, [ebp+8]			;; first argument
	
	imul	eax, ebx			;; multiply; the two-operand form keeps
						;; only the low 32 bits of the result in eax
	pop	ebx
	pop	ebp
	ret
_multiply ENDP

_CODE ENDS
END

 

Then let's add the header, so we can use this function from C. Note that the header declares multiply while the assembly exports _multiply – the 32-bit cdecl calling convention decorates C names with a leading underscore, so both resolve to the same symbol at link time:

#pragma once

int multiply(int a, int b);

 

And to finish it off, the C file that will call our function:

#include "header.h"

int main(int argc, char *argv[]) {
	int a = 3;
	int b = 2;

	int mult = multiply(a, b);

	return 0;
}

Debugging

Now you'll be able to debug it like any other application in Visual Studio. You can set breakpoints and even add Watches to see what's in the registers:

 

One problem that I've noticed happening quite often is that Visual Studio doesn't detect changes made to the .asm file and doesn't recompile the project. I found that a good way to force it is to change the target CPU to something different, build, change back to the preferred setting and rebuild.

Resources

Intel® 64 and IA-32 Architectures Software Developer’s Manual

This x86 Assembly guide at University of Virginia CS

 

Good luck with your experiments and exploring your computer’s architecture!

Weekly Machine Learning drop #14

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. I publish these posts every Friday. They are divided into a few categories and the format is constantly evolving.

News

In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about applications, predictions and controversies around ML.

Inside Waymo’s Secret World for Training Self-Driving Cars
In this article, the author visits Waymo's facility for testing self-driving cars. He describes the operations there and how Waymo uses recorded real-life situations to rerun them in a virtual test environment. This environment has now evolved to also simulate previously "unseen" situations.

Microsoft unveils Project Brainwave for real-time AI
Microsoft presented a new FPGA-based chip specifically designed for running high-performance machine learning computations. Together with it, they announced Project Brainwave, which (apart from the chip) consists of a distributed system architecture and a compiler and runtime for easy deployment of models.

How Big Data Mines Personal Information to Craft Fake News and Manipulate Voters
A darker side of the Machine Learning revolution. With current computing power and algorithms, it has become very easy for malicious actors to gather information about people and create highly tailored news that alters people's opinions and decisions.

How does Physics connect to Machine Learning
A fascinating (and a bit math-heavy) article on how concepts known from physics translate into the machine learning world.

Learning materials

This week I want to share with you a beginner-friendly series of articles on Machine Learning and a practical Deep Learning course from Fast.ai.

Machine Learning for humans
It's a series of articles targeted at technical professionals who want to understand ML, or non-technical people who are happy to engage with technical content.

Practical Deep Learning for Coders
This 7-week course by Fast.ai is a fun, project-based way to learn Deep Learning. It's pretty heavy in content – the authors suggest setting aside 10 hours a week to be able to succeed with it.

This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.

DevDay DevConf is my favourite day! Potentially…

The first programming conference I fell in love with was DevDay. It really opened my eyes when I went there for the first time in 2012, and it never failed to satisfy – I wrote about it multiple times. DevDay won't be organized this year, and maybe won't be organized ever again. But don't worry – Michał and Rafał (together with other awesome people) are starting a new conference. I invite you to DevConf!

DevConf logo

I submitted two talks to the CFP and both have been accepted. So I'm super happy, and also a bit stressed – this is going to be the first time I'm giving two talks at the same conference. Both new. Challenge accepted!

The first talk is related to my interest in Machine Learning, which has been growing over recent years. I'll try to explain the basics of training and evaluating ML models in an approachable way. You're probably going to be disappointed by how easy it is to get a relatively good working model. I hope to get you interested enough that you won't give up when the first obstacles show up.

The second talk will be about my other fascination: how do computers actually work? I'll start with what most programmers know best these days – one of the high-level programming languages. From there I'll explore what lies beneath and what layers have built up over the last few decades. We're standing on the shoulders of the giants of the past, and it's good to appreciate that.

DevConf is held on 13-15 September in Kraków, Poland. It's less than a two-hour flight from most places in Europe. One day of workshops and two days of talks in three tracks, for a very affordable price. Make sure you stay for the weekend – traditionally we also have a lot of fun after the conference. Register here! Hope to see you there!

Weekly Machine Learning drop #13

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. I publish these posts every Friday. They are divided into a few categories and the format is constantly evolving.

News

In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about applications, predictions and controversies around ML.

Amazon’s move signals end of line for many cashiers
Amazon's recent acquisition of Whole Foods is another step for the company towards dominating retail. When you put that together with their Amazon Go experiment, you may start wondering how grocery stores will look in the future.

Mars robot makes decisions on its own
NASA scientists installed 20,000 lines of code on the Mars rover Curiosity to give it some intelligence. Now it can recognize which rocks are worth a closer look, instead of beaming its laser at every rock it finds.

Artificial intelligence beat human player in Dota 2
Another game with a complicated set of rules and a nonlinear path to victory has been beaten by an algorithm. This time it was a system by OpenAI, based on reinforcement learning.

 

DeepMind and Blizzard open Starcraft II as an AI research environment
On a similar topic – StarCraft II now also has a platform available for AI experiments. It contains, among other things, an API and a big (and constantly growing) set of recorded, anonymized gameplays. The release also contains an open-source version of DeepMind's toolkit and access to mini-games that allow training agents for specific tasks.

How Machine Learning is transforming drug creation
Machine learning algorithms, which are good at pattern recognition, can go through new and existing genetic and medical information to find previously unknown connections, which will allow for creating more targeted medication.

Tensorflow 1.3 released
Last week, a new version of Tensorflow was released. Click through to see the list of features and improvements.

Learning materials

This week I want to share with you some repositories with Tensorflow tutorials and best practices, and the new Deep Learning course by Andrew Ng.

Tensorflow tutorials
Speaking of Tensorflow, this GitHub repository contains a bunch of tutorials that are simple, easy to follow and help you grasp the basics of the library.

Tensorflow best practices
And this repository contains a collection of good practices for developing with Tensorflow.

New Deep Learning specialization from Andrew Ng
After his recent departure from Baidu and the founding of Deeplearning.ai, there is more news from Andrew Ng. He recently published a new Coursera specialization focusing on Neural Networks and Deep Learning. Many people started their Machine Learning adventure with his previous Coursera course, and this is definitely a great continuation. I'm in the third week of the first course and can recommend it with a clear conscience.

Also, Andrew recently raised a $150M venture capital fund to invest in AI.

This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.

Weekly Machine Learning drop #12

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. I publish these posts every Friday. They are divided into a few categories and the format is constantly evolving. The last few weeks were a bit hectic due to hosting changes, but I'm back to regular posting.

News

In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Algorithms aren’t racist. Your skin is just too dark.
In this article, Joy shares her story of how facial recognition algorithms fail to recognise darker skin tones. You can also watch her TED talk about it. This issue doesn't just cause minor problems, like cameras not finding somebody's face. With the widespread use of facial recognition by law enforcement, it can get innocent people into trouble. It's part of a bigger problem: algorithms can inherit human biases. She also issues a call to action – she wants to collect example cases of biased algorithms to find a way to fix the problem.

Is China outsmarting America in A.I.?
China is rapidly increasing its support for AI-related projects, while the US is decreasing government spending in that area. The article looks at how those changes can impact the future of the technology. The problem for China may be its traditional top-down management and the lack of a culture of open information exchange. But those things are also changing.

Software is Eating the world, but AI will eat software
Nvidia CEO Jensen Huang shares his opinion on the industries that will be impacted by AI developments. Apart from obvious examples like automotive or healthcare, he paradoxically mentions software.

Apple is working on a dedicated chip to run AI on devices
Yet another company is building custom-designed chips to accommodate the new processing needs of machine learning algorithms. But in contrast to Microsoft or Google, whose chips power their data centres, Apple is rumoured to be planning to put a dedicated AI chip in its devices.

The next big leap in AI could come from warehouse robots
Kindred is a company that has a different approach to AI. In contrast to most tech companies, which focus on software and build chatbots or recommender systems, they believe that true AI innovation will come in the physical form of robots.

Learning materials

This week I have slightly less technical and a bit more visual content.

A visual introduction to machine learning
It's a very nice visual presentation that shows the process of building Machine Learning models, starting with data analysis, through finding relevant features, up to constructing the model. The algorithm used here is a decision tree, which is a pretty basic ML method but can be very effective with certain datasets. This looked like a website with great educational potential; unfortunately, this "part 1" has had no follow-ups.

A neural network playground
It's another visualisation tool that shows the basic inner workings of neural networks. You can choose from several input datasets, and you have control over the construction of the network. You get to set parameters like the number of hidden layers, the number of neurons or the activation function, and see how they impact the results. It sheds a bit of light on the rather mysterious ways in which neural networks work.

This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.

Weekly ML drop #11

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving. Last week I didn't have time to prepare a weekly drop, so some "news" today may be a bit older.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Data is the new oil
In this very interesting article in The Economist, the author argues that in the 21st century, data will be (and already is) the main resource fuelling the economy, much as oil was in the 20th century.

Sent to Prison by a Software Program’s Secret Algorithm
A man charged with fleeing the police in a car was sentenced to 6 years in prison. One of the pieces of information used against him was a set of bar charts analysing the risks and threats he poses to society. It wasn't the main evidence, but it is a little unsettling.

The parts of America most susceptible to automation
A short article discussing which areas of the US will be hit hardest by the advent of automation, and why. There's also an interesting map.

Harnessing automation for a future that works
In this article, on the other hand, the author argues we're not quite there yet, and that we'll need close cooperation between people and machines for automation to progress.

6 areas of AI and ML to watch closely
If you're interested in what's really hot in the industry, this article lists the 6 most interesting technology areas.

Learning materials
Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Deep learning simplified
A series of short videos explaining the basic concepts of Machine Learning. Recommended mostly for total beginners.

This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.

Phoenix, Ecto and time zones

This is another part of my series on learning Elixir.

In the previous episode, I calculated the flight distance. I will use it later for gathering statistics. Another metric I would like to have is the flight time. Usually, flight departure and arrival times are presented in local time, so it gets a bit tricky to calculate the duration when the journey spans a few time zones.

Elixir’s standard DateTime library doesn’t work with time zones very well. The Internet suggests I should use Timex.
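To get a feel for why Timex helps here, below is a tiny sketch (not from the app – the timestamps and time zones are made up) of diffing two zone-aware datetimes. Both values represent absolute points in time, so the offset difference is handled for us:

departure = Timex.to_datetime({{2017, 5, 7}, {10, 30, 0}}, "Europe/Warsaw")
arrival = Timex.to_datetime({{2017, 5, 7}, {13, 45, 0}}, "America/New_York")

Timex.diff(arrival, departure, :minutes)
# => 555 minutes in the air (9:15), despite the 3:15 wall-clock difference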

But first I needed to make some changes to my database, because up until now I only had the flight date. A few weeks ago I wrote a post on how to update your schema with migrations, and this time I followed the same steps. Still, I haven't figured out how to do it in a more automated manner.

defmodule Flightlog.Repo.Migrations.CreateFlight do
  use Ecto.Migration

  def change do
    alter table(:flights) do
      modify :arrival_date, Timex.Ecto.DateTimeWithTimezone
      modify :departure_date, Timex.Ecto.DateTimeWithTimezone
    end
    
  end
end

As you can see, I used specific Timex types. This works only with Postgres, and if you want to use time zones, it requires one additional step: you have to add a custom type to your database:

CREATE TYPE datetimetz AS (
    dt timestamptz,
    tz varchar
);

You can read more about using Timex with Ecto on this documentation page.

I also updated my flight.ex model. It looks like this now:

defmodule Flightlog.Flight do
  use Flightlog.Web, :model

  schema "flights" do
    field :departure_date, Timex.Ecto.DateTimeWithTimezone
    field :arrival_date, Timex.Ecto.DateTimeWithTimezone
    field :flight_number, :string
    field :plane_type, :string
    field :from, :string
    field :to, :string

    timestamps()
  end

  @doc """
  Builds a changeset based on the `struct` and `params`.
  """
  def changeset(struct, params \\ %{}) do
    struct
    |> cast(params, [:departure_date, :arrival_date, :flight_number, :plane_type, :from, :to])
    |> validate_required([:departure_date, :arrival_date, :flight_number, :plane_type, :from, :to])
  end
end

After that, I followed the path that had already proven to work in the flight distance part. I added a new function to my math.ex library, making use of Timex's diff function:

    def flightTime(earlier, later) do
        hours = Timex.diff(later, earlier, :hours)
        minutes = rem(Timex.diff(later, earlier, :minutes), 60)
        "#{hours}:#{minutes}"
    end

And I’m calling it from the view in another function, so it’s easily accessible from the template:

  def time(time1, time2) do
    Flightlog.Math.flightTime(time1, time2)
  end

And that was it, although it took me a few hours because I was struggling a bit with the Timex types. I didn't read the documentation carefully and, for example, missed the step of creating the new Postgres type. A good lesson to read the docs carefully :)

The effect:

 


As you can see, I made an attempt at formatting the dates. Unfortunately, I didn't manage to show them in local time – they're in UTC here. This will most likely be the next step.
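For the record, a minimal sketch of what that next step could look like (untested, and the time zone name is just a placeholder – the model doesn't store one yet), using Timex to convert a datetime into a zone and format it:

  # Sketch only: the zone would have to come from the flight data.
  def local_time(datetime, timezone \\ "Europe/Warsaw") do
    datetime
    |> Timex.Timezone.convert(timezone)
    |> Timex.format!("{YYYY}-{0M}-{0D} {h24}:{m}")
  end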

That's all for today. Next week we'll try to test this new module. In the meantime, check out the previous episodes. And if you're interested in machine learning, look into my weekly link drop.

Weekly ML drop #10

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

Neuralink and the Brain's Magical Future
Tim Urban from Wait But Why wrote another lengthy article on one of Elon Musk's projects. This time it's about Neuralink, a sort of direct interface to the brain. Tim has also tackled other AI-related topics, like superintelligence, which is also a worthy read.

The Myth of Superhuman AI
Kevin Kelly, on the other hand, thinks that we'll never get to superintelligence, for the five reasons he lists. Kevin was also mentioned in the edition from two weeks ago. I am currently reading his book, and if you're interested in the future, it's a must-read.

Software Predicts Cognitive Decline Using Brain Images
We have talked several times about the advancements that Machine Learning brings to medicine. Image recognition algorithms are reaching or surpassing human-level performance at detecting various threats to our health in diagnostic imagery. This article covers using a neural network for early detection of Alzheimer's disease.

The first wave of Corporate AI is doomed to fail
The author of the article compares the current wave of AI advances to the first booms of the internet and cloud computing, which failed miserably. Only after backing off and making further advances did those technologies hit a home run.

Waymo's self-driving cars will take first riders
Alphabet's self-driving car company is going to run tests in Phoenix, including people from outside Google. People can sign up, and the rides will be free. The goal is to check how people use and react to self-driving cars.

Learning materials
Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Videos from the ICLR conference have been published
The International Conference on Learning Representations, which took place in Toulon, France in late April, has just published its videos on Facebook.

Learning AI if you suck at math
This is a good article for total beginners who not only don't have much experience in Machine Learning but also feel they're lacking in math. It links to several good resources to brush up on your algebra and calculus.

This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.

Weekly ML drop #9

I've become more and more interested in machine learning over the last year. This is my way of collecting and sharing interesting reads on the topic that I stumble upon. These posts are published each Friday; they are divided into a few categories and the format is constantly evolving.

News
In this part, I share interesting news from the machine learning and artificial intelligence world. These are mostly non-scientific articles about interesting applications, predictions and controversies that AI causes.

We need tools to track AI's impact on the job market
An expert panel composed mainly of economists and computer scientists said in a new report that the world needs a way to measure how technology impacts the job market – much as when we started measuring our economies in the 1930s, which greatly improved governments' awareness of the issues to address.

Series of articles on how AI is used in the biggest tech companies
Over the last year, Backchannel visited major tech companies to interview them about how they use Machine Learning and AI: Apple, Google and Facebook.

Machine learning will be a great help in detecting cancer
This article from Google's research blog shows how the assistance of machine learning algorithms will greatly improve cancer detection.

Will democracy survive Big Data and Artificial Intelligence?
This is a longer read from Scientific American that analyses the various impacts that Big Data and the growth of Machine Learning will have on future societies. It also tries to answer the question of what we should do now to secure our future.


Learning materials

Here I’m sharing material for learning ML that I found useful – online courses, blogs, books etc. This is usually rather technical stuff.

Intel’s Deep Learning 102
A continuation of the webinar I linked last week. In this part, they give an overview of more advanced topics, like convolutional and recurrent neural networks.


This is it for today, thanks for reading. If you liked the post, let me know and please check other parts of the series.