Sunday, May 25, 2014

Week 2: IPython - Proof of Concept

Hello reader,

I have done another proof of concept, this time using IPython widgets.

In this proof of concept, we aimed to re-implement the same thing we did before, but of course with the IPython notebook.

The main idea was to have a widget with a string (Unicode) trait attribute. This trait holds the base64-encoded representation of a PNG image. On every update, the widget captures the screen, saves it as a PNG, and encodes it with base64. As soon as the trait value is updated with the encoded PNG, the JavaScript side also updates itself simultaneously. This is the power of IPython's widget machinery.
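To give an idea, here is a minimal sketch of what such a widget could look like on the Python side (the class name, view name, and update_image() helper are hypothetical, not the actual code from the proof of concept):

```python
import base64

from IPython.html import widgets
from IPython.utils.traitlets import Unicode

class ScreenshotWidget(widgets.DOMWidget):
    # Hypothetical view name; the JavaScript counterpart renders the image.
    _view_name = Unicode('ScreenshotView', sync=True)
    # Base64-encoded PNG; every assignment is pushed to the browser automatically.
    image = Unicode(sync=True)

    def update_image(self, png_bytes):
        # png_bytes is the raw PNG data captured from the Vispy canvas.
        self.image = base64.b64encode(png_bytes)
```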

We have one problem, though. We capture the screen with glReadPixels(), which works fine, but in the end we have a numpy.ndarray in our hands. So, to get a PNG image, we were using the PIL imaging library and its Image.fromarray() function to convert the ndarray into an image, then saving it into a cStringIO buffer and encoding the result with base64. Unfortunately, this approach is very slow. We also had this issue in the earlier proofs of concept and solved it there with threading, but after a discussion we agreed not to use threads.
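For reference, the slow path looked roughly like this (a simplified sketch, not the exact code):

```python
import base64
from cStringIO import StringIO

import numpy as np
from PIL import Image

def ndarray_to_base64_png(im):
    """Encode a (height, width, 3) uint8 array as a base64 PNG string."""
    buf = StringIO()
    Image.fromarray(im).save(buf, format='PNG')   # PIL does the PNG encoding
    return base64.b64encode(buf.getvalue())

# Example with a dummy "screenshot"; in the real code `im` comes from glReadPixels().
im = np.zeros((480, 640, 3), dtype=np.uint8)
encoded = ndarray_to_base64_png(im)
```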

To solve this problem, one of the Vispy-devs, Luke Campagnola, did a great job and wrote a pure-Python PNG writer. This let us remove the PIL dependency and the need to save into a buffer. After some benchmarks, we saw that it is the zlib.compress() call that slows down the process, but since this is still a proof of concept, we decided to let it go for now.
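Just to illustrate the idea (this is a rough sketch of how such a writer works, not Luke's actual implementation), writing an 8-bit RGB numpy array as a PNG takes only a handful of lines:

```python
import struct
import zlib

def _chunk(tag, data):
    # A PNG chunk is: length, tag, data, and a CRC32 computed over tag + data.
    return (struct.pack('>I', len(data)) + tag + data +
            struct.pack('>I', zlib.crc32(tag + data) & 0xffffffff))

def write_png(im):
    """Return PNG file bytes for a (height, width, 3) uint8 numpy array."""
    h, w = im.shape[:2]
    # Prepend the filter byte (0 = no filtering) to every scanline.
    raw = b''.join(b'\x00' + im[row].tostring() for row in range(h))
    ihdr = struct.pack('>IIBBBBB', w, h, 8, 2, 0, 0, 0)  # 8-bit depth, RGB color type
    return (b'\x89PNG\r\n\x1a\n' +
            _chunk(b'IHDR', ihdr) +
            _chunk(b'IDAT', zlib.compress(raw)) +  # this compression dominates the runtime
            _chunk(b'IEND', b''))
```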

I think this proof of concept also went well. What we need now is some interactivity. We will listen for user events with JavaScript and send them to Vispy to handle. For this purpose, we are going to design a protocol (wow, I am going to design a protocol!) that defines the JSON representations of user events.
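Nothing is designed yet, but just to give an idea, a JSON-encoded mouse event might look something like this (all field names here are hypothetical):

```python
import json

# Hypothetical structure of a single user event; the real protocol is still to be designed.
event = {
    'type': 'mousemove',    # or 'mousedown', 'mouseup', 'keypress', ...
    'pos': [320, 240],      # pixel coordinates inside the canvas
    'button': None,         # which mouse button was involved, if any
    'modifiers': [],        # e.g. ['shift', 'ctrl']
}
message = json.dumps(event)
```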

Next stop: Design of a JSON protocol
See you next week!


Saturday, May 17, 2014

Week 1: Vispy/experimental - Proof of Concepts

Hello reader,
       
This is my first post since the GSoC results were announced, and I have so much to say!

If you are reading this post and have no idea what I am talking about, you should read my first two posts: Hello GSoC and First Contribution to Vispy.

In the past weeks, my PR against Vispy/experimental was merged; after some discussion with the Vispy-devs, I now have to make a new one.

As I said in my previous post, I tried to implement the interactivity part in the browser. For this purpose, I chose two of the Vispy demos: "rain.py" and "hex-grid.py". The work was to listen for mouse events via JavaScript and send them to Vispy over a WebSocket. Since this was just a proof of concept, I dealt only with the "mousemove", "mouseup" and "mousedown" events:

rain.py on the browser
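
For the curious, the Python side of this setup can be sketched with a small Tornado WebSocket handler (a simplified illustration, not the actual code from the PR; handle_event() is a hypothetical hand-off to the Vispy canvas):

```python
import json

import tornado.ioloop
import tornado.web
import tornado.websocket

class EventHandler(tornado.websocket.WebSocketHandler):
    def on_message(self, message):
        # The JavaScript side sends each mouse event as a JSON string.
        event = json.loads(message)
        if event.get('type') in ('mousemove', 'mousedown', 'mouseup'):
            handle_event(event)

def handle_event(event):
    # In the real proof of concept this would update the Vispy scene.
    pass

application = tornado.web.Application([(r'/events', EventHandler)])

if __name__ == '__main__':
    application.listen(8888)
    tornado.ioloop.IOLoop.instance().start()
```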



After this proof of concept also went well, we agreed that we should move to the next level and discuss the design with the IPython-devs. The main reason behind this was to use IPython 2.0's interactive widgets for the IPython notebook. So, the final decision was to have an architecture that integrates nicely with the IPython notebook, but *also* works without IPython.

After discussing with the IPython-devs on their mailing list, we learned two things:

1- Jason Grout has some interesting work in this direction.

2- The IPython-devs recommend that we use the widget machinery (based on Backbone.js and MVC) instead of going lower-level with WebSockets, JSON, Tornado, Comms, etc.

So, we needed to completely revisit the current approach. The good news is that it should be much simpler, since some of the architecture is already implemented in IPython 2.0.

Now, the final idea is to create a widget with a string trait attribute that would contain the base64-encoded PNG representation of the image, and another structure that would contain user actions (mouse position, clicks, key presses, etc.). These are automatically synchronized between Python and JavaScript in real time by the IPython widget machinery.
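A minimal sketch of this design could look like the following (all names here are hypothetical; the JavaScript view would set the `events` trait whenever the user interacts with the canvas):

```python
from IPython.html import widgets
from IPython.utils.traitlets import Unicode

class VispyWidget(widgets.DOMWidget):
    _view_name = Unicode('VispyView', sync=True)  # hypothetical JS view name
    image = Unicode(sync=True)    # base64-encoded PNG, pushed Python -> JavaScript
    events = Unicode(sync=True)   # JSON-encoded user actions, pushed JavaScript -> Python

w = VispyWidget()

def on_events(name, value):
    # Called automatically whenever the JavaScript side updates the `events` trait.
    pass  # decode the JSON and forward the actions to the Vispy canvas here

w.on_trait_change(on_events, 'events')
```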

I have started working on another proof of concept in the IPython notebook. I have to say that the IPython-devs have done excellent work. You should definitely try IPython widgets if you haven't yet.

Next stop: Proof of concept with IPython notebook!
See you soon ;)