Tuesday, August 12, 2014

Finishing Backends

Hello reader,

Before we begin, here is how GSoC is going for me:


I have finished the static backend. This backend displays a PNG image in the IPython notebook and is just the basic version of our VNC backend. We used IPython's `display_png()` function to display the PNG. I can say it is ready to merge.

On the other side, we solved the timer issue that I mentioned in my previous post. We used a JavaScript timer with the `setInterval()` function of the JS window object. The front-end (JS) sends 'poll' events to the backend (Python), and when the backend receives one of these events, it generates a 'timer' event. Therefore, on the vispy canvas object we created in the IPython notebook, we can connect a callback function (like `on_timer`) and use it without any problem. This means we can do animations now. That is great!
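The poll-to-timer mechanism can be sketched roughly like this (the class and method names here are illustrative, not the actual vispy internals):

```python
import time

class TimerBridge:
    """Turn 'poll' messages from the JS front-end into 'timer' events.

    The JS side calls window.setInterval() and sends a 'poll' message
    on each tick; the Python side converts each poll into a timer event
    that connected callbacks (e.g. on_timer) can consume.
    """
    def __init__(self):
        self._callbacks = []
        self._last = None

    def connect(self, callback):
        self._callbacks.append(callback)

    def on_message(self, msg):
        if msg.get("type") == "poll":
            now = time.time()
            dt = 0.0 if self._last is None else now - self._last
            self._last = now
            for cb in self._callbacks:   # emit a 'timer' event
                cb({"type": "timer", "dt": dt})

bridge = TimerBridge()
ticks = []
bridge.connect(ticks.append)
bridge.on_message({"type": "poll"})      # as if sent from JS setInterval
```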

Another piece of good news: we managed to hide the vispy canvas! That was a big problem from the beginning. Eric and Almar, two of the vispy-devs, proposed a solution: showing the canvas once and hiding it immediately. I had tested this solution myself before, but couldn't get it to run. After that, Almar took over the issue and solved it like a boss! The problem (as I understood it) was that Qt refuses to draw when there is no visible canvas. So we force the draw manually and voilà! Problem solved:


See you next post!

Saturday, July 26, 2014

VNC Backend

Hello reader,

Last week we finished the JavaScript part of our IPython-VNC backend. One of the tricky parts was importing the JavaScript code into the IPython notebook as soon as the user creates a vispy canvas. After solving this issue, we are now able to handle user events like mouse moves, key presses and mouse wheel scrolls.

Regarding the implementation, we listen to the HTML canvas in the IPython notebook with JavaScript. As soon as an event is detected, we generate an event with the proper name and type according to our JSON spec. The generated event is sent to vispy with the widget's `this.send()` method, which lets us send a message immediately from the front-end to the backend. When a message is received from the front-end, the backend generates the appropriate vispy event. For instance, if we receive a mousepress event from the front-end, we generate a vispy_mousepress event in the backend. That way, the user can use the connected `on_mouse_press` callback on the vispy canvas.
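The backend's side of this translation can be sketched as a simple dispatch table (a rough sketch with a fake canvas; the real backend calls internal vispy methods rather than the `on_*` callbacks directly):

```python
def dispatch(canvas, msg):
    """Map a JSON message from the JS front-end to a canvas callback.

    Message type names follow our JSON spec; the handler mapping here
    is illustrative, not vispy's actual internal wiring.
    """
    handlers = {
        "mousepress":   canvas.on_mouse_press,
        "mouserelease": canvas.on_mouse_release,
        "mousemove":    canvas.on_mouse_move,
        "keypress":     canvas.on_key_press,
    }
    handler = handlers.get(msg["type"])
    if handler is not None:
        handler(msg)

class FakeCanvas:
    """Stand-in for a vispy canvas, just to exercise the dispatch."""
    def __init__(self):
        self.received = []
    def on_mouse_press(self, ev):
        self.received.append(("press", ev["pos"]))
    def on_mouse_release(self, ev): pass
    def on_mouse_move(self, ev): pass
    def on_key_press(self, ev): pass

canvas = FakeCanvas()
dispatch(canvas, {"type": "mousepress", "pos": [10, 20]})
```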

We embedded an IPython DOMWidget into our VNC backend to ease the communication between JS and Python. We want this backend to be very easy to use, so the user never needs to deal with listening for events, sending them to Python, creating an IPython widget or even writing JavaScript.

There are still some problems though. For mouse moves, JavaScript captures every position the mouse passes through, every single pixel. So generating and sending a mouse move event for each of them takes a lot of time; during a drag operation, for example, this causes a noticeable lag. Also, it is nearly impossible to capture the screen, convert it to PNG, encode it with base64 and send it through the websocket at the speed of mouse movement, which is another source of the lag.
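One standard mitigation (not something we have implemented yet, just a sketch of the idea) is to coalesce move events and forward only the latest position per flush, so a flood of per-pixel events collapses into one:

```python
class MoveCoalescer:
    """Keep only the most recent mouse-move position per flush.

    The browser reports every pixel the cursor crosses; sending each
    one over the websocket is what causes the drag lag, so intermediate
    positions can be dropped and only the latest one forwarded.
    """
    def __init__(self):
        self._pending = None

    def push(self, pos):
        self._pending = pos          # overwrite older, unsent positions

    def flush(self):
        pos, self._pending = self._pending, None
        return pos

c = MoveCoalescer()
for x in range(100):                 # a burst of 100 move events...
    c.push((x, 0))
latest = c.flush()                   # ...collapses into a single one
```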

Another problem is that we cannot use a Python timer. In vispy we use the backend's timer (QTimer for Qt, etc.), but here it is not possible to run a QTimer and IPython's event loop at the same time. We have to think of a different way to have a timer, or else let go of the timer-based animation option.

See you next post!

Sunday, July 6, 2014

Testing Screenshots

Hello reader,

This week I updated and tested the screenshot ability of Vispy.

Normally, Vispy utilizes the `read_pixels()` function, which basically reads pixels from a GL context. This function returns a numpy array, but an upside-down one. So what I did was flip that array (which is extremely easy in Python) and add a test for it. The test is very simple too: we draw a white square in the top left of a black screen, then after reading the pixels we check whether the image is flipped by summing the pixel values in the corners. If the top-left corner's pixel values are greater than 0 and the rest are equal to 0, the screenshot was taken correctly. You should know that I had never drawn anything with GL functions before; that was a first for me.
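The flip and the corner check can be illustrated in miniature (plain lists here for clarity; on the real numpy array the flip is just `img[::-1]`, and the actual vispy test differs in details):

```python
def fix_orientation(rows):
    """GL returns pixel rows bottom-up; reverse them to top-down order."""
    return rows[::-1]

# Simulated 4x4 grayscale screenshot as GL returns it: a white 2x2
# square drawn at the top left of the screen lands at the *bottom*
# of the bottom-up buffer.
gl_buffer = [
    [0,   0,   0, 0],
    [0,   0,   0, 0],
    [255, 255, 0, 0],
    [255, 255, 0, 0],
]
img = fix_orientation(gl_buffer)

# Corner check from the test: only the top-left corner may be non-zero.
top_left = sum(img[r][c] for r in (0, 1) for c in (0, 1))
rest = sum(sum(row) for row in img) - top_left
```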

Now I am making another PR with the pure-Python PNG writer by Luke Campagnola (a vispy-dev). This will help us remove another image library dependency (PIL). More updates will come after it has been merged.

See you next post!

Friday, July 4, 2014

IPython Backend

Hello reader,

In the past weeks, I have tried to add the interactivity part on top of IPython. Since both IPython and our backend (glut, qt, etc.) need their own event loops, we haven't managed to handle user events yet.

Our only option was to enter an event loop whenever we capture a user event, for example a mouse press. This approach seems nearly impossible with GLUT (though not with freeglut), because `glutMainLoop()` never returns. Using Qt's `processEvents()`, we managed this:
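The pattern looks roughly like this (a stand-in object replaces the real `QApplication` here, since the point is only when `processEvents()` gets called):

```python
class FakeQtApp:
    """Stand-in for QtGui.QApplication; the real code calls
    processEvents() on the actual Qt application object."""
    def __init__(self):
        self.pending = []
        self.processed = []
    def processEvents(self):
        self.processed.extend(self.pending)
        self.pending.clear()

def handle_user_event(app, event):
    app.pending.append(event)
    # Instead of entering a blocking main loop (glutMainLoop-style),
    # pump Qt's event queue once per incoming notebook event:
    app.processEvents()

app = FakeQtApp()
handle_user_event(app, "mouse_press")
```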



Using this method we are close to finishing the static part: we process events only when the user sends one. For animation, we are still working on it.

Also, this week one of the vispy-devs, Almar Klein, finished the base code of the IPython backend in vispy/app. That means we are going to support the IPython notebook officially! :)

See you next post!

Thursday, June 19, 2014

Week 4&5: JSON Protocol and Implementation

Hello reader,

Last week I had my final exams and my thesis defence. We also finished the JSON protocol that week. But since I don't have much to say about it, I am going to cover two weeks' work in one post.

This week we tried to implement the interactivity part in the IPython notebook. We had done a proof of concept using rain.py before, so I decided to re-use it. Houston, we have a problem.

rain.py uses the GLUT backend, like most of Vispy's demos. GLUT has a function named `glutMainLoop()` which enters the GLUT event processing loop. That is fine, except that once called, this routine never returns. Here our problem occurs: since this function takes over control, our widgets stop syncing with Python, and we are unable to handle user events!

We are currently searching for a solution. The vispy-devs suggest avoiding GLUT in such cases.

Next post: The Solution (I hope!)

See you next post!


Tuesday, June 3, 2014

Week 3: Design of a JSON Protocol

Hello reader,

We are going to try handling user actions in the HTML canvas, where our visualization occurs. We have done something like that before, but without IPython. Now it is time to try it in the IPython notebook. We use JavaScript to listen for events on the canvas, so we need a protocol for sending these events from JavaScript to Vispy. Thus, we are designing a JSON protocol.

Regarding the implementation, we are going to focus on the IPython notebook for now. We seem to have two options:

1- A JavaScript function generates a JSON document of the events, following user actions. This JSON is sent to Python via a Unicode trait on the widget. A Python callback function is automatically called when this trait is updated: it parses the JSON, generates the Event instances and passes them to Vispy.

2- We get closer to the widget machinery, following something like this. The idea is to write Python callbacks for user actions on the browser's side. The widget machinery makes that possible in a clean way. JavaScript is responsible for sending the appropriate properties (those defined in the JSON protocol) to the Python callbacks, and the Python callback functions are responsible for raising the Vispy events.
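The first option can be shown in miniature: a Python callback fires when the trait changes, parses the JSON and turns each entry into an event pair (the field names here are illustrative, since the spec is still being written):

```python
import json

received = []

def on_events_changed(new_value):
    """Callback fired when the JS side updates the Unicode trait:
    parse the JSON and turn each entry into a (name, payload) pair
    that would then be handed to Vispy."""
    for ev in json.loads(new_value):
        received.append((ev["type"], ev))

# As if JavaScript had just written a batch of events into the trait:
on_events_changed(json.dumps([
    {"type": "mousemove", "pos": [120, 45]},
    {"type": "mousedown", "button": 0, "pos": [120, 45]},
]))
```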

My mentor (Cyrille Rossant) and I think the second approach is more elegant and straightforward, so we are concentrating on the latter.

Back to the JSON protocol: we are trying to follow the same nomenclature and hierarchy as what is currently implemented in vispy.app. So far I have written nearly half of it. It seems we need one or two weeks to implement these ideas.

Next week: Finishing Touches on JSON Protocol
See you next week!

Sunday, May 25, 2014

Week 2: IPython - Proof of Concept

Hello reader,

I have done another proof of concept using IPython widgets.

In this proof of concept, we aimed to re-implement the same thing we did before, but this time in the IPython notebook.

The main idea was to have a widget with a string (Unicode) trait attribute containing the base64-encoded representation of the PNG image. This widget captures the screen on every update, saves the image and encodes it with base64. As soon as it updates its trait value with the encoded PNG, the JavaScript side updates itself simultaneously. This is the power of IPython's widget machinery.

We have one problem though. We capture the screen with glReadPixels(). This works fine, but it leaves us with a numpy.ndarray. So, to get a PNG image, we were using the PIL image library and its Image.fromarray() function to convert the ndarray into a PNG, then saving this image into a cStringIO buffer and encoding it with base64. Unfortunately, this approach is very slow. We had the same issue in the earlier proofs of concept; back then we solved it using threading, but after a discussion we agreed not to use threads.
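The per-frame pipeline, stripped to its expensive core, looks like this (stdlib-only sketch: a real PNG also needs headers, row filtering and chunk CRCs, which is what the pure-Python writer handles):

```python
import base64
import zlib

def frame_to_b64(raw_pixels):
    """Compress raw pixel bytes and base64-encode them for the
    widget's Unicode trait. zlib.compress() is the step our
    benchmarks later pointed at as the bottleneck."""
    compressed = zlib.compress(raw_pixels)
    return base64.b64encode(compressed).decode("ascii")

# A fake 64x64 all-black RGB frame (3 bytes per pixel):
frame = bytes(64 * 64 * 3)
payload = frame_to_b64(frame)
```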

To solve this problem, one of the vispy-devs, Luke Campagnola, did a great job and wrote a pure-Python PNG writer. This helped us remove the PIL dependency and the need to save into a buffer. Some benchmarks showed that it is the zlib.compress() call that slows down the process, but since this is still a proof of concept, we decided to let it go.

I think this proof of concept also went well. What we need now is some interactivity: we will listen for user events with JavaScript and send them to Vispy to handle. For this purpose, we are going to design a protocol (wow, I am going to design a protocol!) that defines the JSON representations of user events.

Next stop: Design of a JSON protocol
See you next week!


Saturday, May 17, 2014

Week 1: Vispy/experimental - Proof of Concepts

Hello reader,
       
This is my first post since the GSoC results were declared and I have so much to say!

If you are reading this post and have no idea what I am talking about, you should read my first two posts: Hello GSoC and First Contribution to Vispy.

In the past weeks, my PR against Vispy/experimental has been merged and, after some discussion with the vispy-devs, I have to make a new one.

As I said in my previous post, I tried to implement the interactivity part in the browser. For this purpose, I chose two of the Vispy demos: "rain.py" and "hex-grid.py". The work was to listen for mouse events via JavaScript and send them to Vispy using a WebSocket. Since this was just a proof of concept, I dealt only with the "mousemove", "mouseup" and "mousedown" events:
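The messages going over the WebSocket were small JSON objects, something along these lines (the exact field names from the proof of concept are illustrative here):

```python
import json

def make_event(kind, x, y, button=None):
    """Build the JSON string the JS listener would send over the
    WebSocket for one mouse event."""
    msg = {"event": kind, "pos": [x, y]}
    if button is not None:
        msg["button"] = button
    return json.dumps(msg)

# What a left-button press at (100, 60) would look like on the wire:
wire = make_event("mousedown", 100, 60, button=0)
decoded = json.loads(wire)
```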

rain.py on the browser



After this proof of concept also went well, we agreed to go to the next level and discuss the design with the IPython-devs. The main reason was to use IPython 2.0's interactive widgets for the IPython notebook. So, the final decision was to have an architecture that integrates nicely with the IPython notebook but *also* works without IPython.

After discussing with the IPython-devs on their mailing list, we learned two things:

1- Jason Grout has some interesting work in this direction.

2- The IPython-devs recommend we use the widgets machinery (based on backbone.js and MVC) instead of going lower-level with WebSockets, JSON, Tornado, Comms, etc.

So we needed to completely revisit the current approach. The good news was that it should be much simpler, since some of the architecture is already implemented in IPython 2.0.

Now, the final idea is to create a widget with a string trait attribute that contains the base64-encoded PNG representation of the image, and another structure that contains user actions (mouse position, click, key_press, etc.). These are automatically synchronized between Python and JavaScript in real time by the IPython widgets.

I have started another proof of concept in the IPython notebook. I have to say the IPython-devs did excellent work. You should definitely try IPython widgets if you haven't yet.

Next stop: Proof of concept with IPython notebook!
See you soon ;)

Friday, March 28, 2014

I have won an iPad!

Last Tuesday, OBSS organized a competition at my university, called ITalent - Java Roadshow.

The goal is to be the first person to answer the question correctly, meaning the first to produce code with the correct output. The winner gets an iPad:

First Contribution to Vispy

Finally, my first pull request has been merged!

I tried to make a proof of concept for the Vispy-in-the-browser project, and I believe it was a successful one.

I managed to create a server with two threads: while the main thread deals with the visualization, the server thread serves it through a WebSocket. I also wrote an HTML/JavaScript client which receives PNG images from this WebSocket and draws them. That should be enough for a proof of concept.
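The two-thread split can be sketched with the standard library (a queue stands in for the WebSocket here, so the sketch stays self-contained):

```python
import queue
import threading

frames = queue.Queue()   # hand-off from the render thread to the server
shipped = []             # stands in for what the client received

def server_thread():
    """Server side: take each rendered frame off the queue and ship it.
    The real code wrote the PNG bytes to a WebSocket here."""
    while True:
        frame = frames.get()
        if frame is None:        # sentinel: shut the server down
            break
        shipped.append(frame)

t = threading.Thread(target=server_thread)
t.start()
for i in range(3):               # main thread "renders" three frames
    frames.put("png-%d" % i)
frames.put(None)
t.join()
```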

It is very simple to play with. Try it here.

Next step: Interactivity on browser!

Hello GSoC

Google Summer of Code is a global program that offers post-secondary student developers ages 18 and older stipends to write code for various open source software projects. Through Google Summer of Code, accepted student applicants are paired with a mentor or mentors from the participating projects, thus gaining exposure to real-world software development scenarios and the opportunity for employment in areas related to their academic pursuits. In turn, the participating projects are able to more easily identify and bring in new developers. Best of all, more source code is created and released for the use and benefit of all.

I applied to GSoC this month and I am very excited about it. I found an interesting project under Vispy (under the umbrella of the Python Software Foundation) that really suits my skills. Vispy is a real-time interactive scientific visualization library written in Python. Since web-based visualization has high potential, I chose to bring Vispy to the browser.

I hope I will be working on this project this summer.