cfoch-dev: non-confidential development files

cfoch && GUADEC 2014

On the night of 25 July, at the Jorge Chávez airport (Lima, Perú), there was a suitcase with the image of a foot printed on it, stuck in the middle of everything you could imagine there. This suitcase would later be sent to the Entzheim airport (Strasbourg, France). It arrived in Strasbourg at night and was taken, in the middle of the night, to a house where it would stay for about one week. There was something strange about this house: it was hosting many objects engraved with the figure of the foot. In fact, this was no coincidence, and something big was about to happen.

When I got to FEC, a Frenchman showed me the way to my room. When I entered, I realized that someone else had already been there. I was unpacking my suitcase and arranging my stuff when someone opened the door: it was my roommate. Anuj Khale, a guy from India, would be my roommate for the next few days. He was a GSoC student like me. After some talk in the room, we went out for a walk through the streets of Strasbourg, and afterwards each of us went to sleep. The next day the Evince Hackfest would begin, and he was participating in it.

I had been participating in the Google Summer of Code (GSoC) program with Pitivi, a video editor. During the initial period of the GSoC, I had been in contact with the Pitivi guys via IRC and Hangouts, but I hadn't met them face to face. On the second day, at breakfast before the Evince Hackfest, I met many GNOME people: one of them was Kat, who is part of the Travel Committee, and Jeff, UI designer of Pitivi and now, like Kat, part of the Board of Directors, who had been helping me with many good ideas about the development of my project. And not everyone was a new face: Julita was there too. She is a Peruvian girl, part of the documentation team, a former OPW intern, and the organizer of many GNOME events in Perú.

The first talk I attended was "What's new in GStreamer" by Tim Müller and Sebastian Dröge. There I met more people I had been talking with via IRC for months. I got to meet the Pitivi guys: Mathieu, Thibault and Lubosz, another GSoC student like me. It was really amazing to meet them in person. We talked about Pitivi, football and food while we went to have lunch at a French restaurant. After that we went to the venue to work on Pitivi and GStreamer. Thibault's brother has a flat in Strasbourg, and when the conference day finished we went there to continue hacking! It was like that almost every day. I realized there how difficult it can be to find a place to eat when a Canadian, a German, a Peruvian and two French guys are all together in a car: you can spend almost 45 minutes going in circles around the city, get out of the car, finally decide to go eat somewhere else, and go in circles again! Maybe it wasn't a full hour, but it was a lot of time going in circles, and it was really funny.

GUADEC 2014 has been really helpful for me to understand a lot of details I couldn't grasp before. The conference got me more interested in GStreamer, because Pitivi depends so much on it. Now the Google Summer of Code is about to end and I have more ideas I would like to implement or improve, like "film titles". I am going to continue contributing to Pitivi and GStreamer. GNOME people, you're really amazing. I hope to see you next year. Now the suitcase with the image of the foot lies somewhere in my house in Lima, hoping to set out again and enjoy a new adventure.


http://youtu.be/r07bgp6jDIM

[Poor quality video]

Hi, ImageSequenceSrc!

Hello guys. I’ve written an element in gst-plugins-bad which I called GstImageSequenceSrc. It works very similarly to GstMultiFileSrc, but there are some differences.

Differences from GstMultiFileSrc:

* It handles a list of filenames instead of a printf pattern like GstMultiFileSrc does. Having a list of filenames is useful because you can set exactly the filenames you want, in the order you want: for example, you can add more filenames or sort the list.
* There are two ways to create this list:
    a) Setting the location property. This value could look like:

  1. '/some/folder/with/images/' or '.'
  2. '/a/path/img.jpg,/other/path/img2.jpg' or 'img1.png,img2.png'
  3. '/a/path/*.JPEG' or '*.png'

    b) Setting the filename-list (GList) directly.

* It creates an "imagesequence://" protocol which allows playbin to play this element. It handles a ‘fake’ URI, but it ends up being useful.

gst-launch-1.0 playbin uri="imagesequencesrc:///some/path/*.jpg?framerate=20/1&loop=1"

* It “detects” the mimetype of the images. You only have to worry about the framerate.
* It calculates the duration.

Things to improve:

* Seeking: it seeks to the wrong image sometimes (actually, after many seeks).
* The way duration is calculated.

PS: stormer, a guy who hangs out on #gstreamer, was telling me to support png *and* jpeg files (both at the same time), but I don’t see a use case.

You can see the most stable version of this element at:
https://github.com/cfoch/gst-plugins-bad/tree/sequences/gst/sequences

The development branch (currently identical to it) is at:
https://github.com/cfoch/gst-plugins-bad/tree/sequences-develop/gst/sequences

\o/


How would you like the Pitivi image-sequence feature to look?

Hello! I’m Fabián Orccón. I’ve been accepted to Google Summer of Code 2014 for GNOME, working on Pitivi. My project consists of adding an image-sequence feature to Pitivi. The use cases include creating stop-motion films or 2D animations, for example.
The idea for this feature came to my mind when I wanted to film a homemade stop-motion movie with my sister. I had Ubuntu, and at that time Pitivi was the default video-editing tool. I realized that Pitivi didn’t have this feature, and I wanted to implement it.
I’m not an expert in the arts of cinematography. I talked with Pitivi people and with Bassam Kurdali, director of the first open movie, about how Pitivi could handle this feature. After mixing and organizing ideas, I came up with this mockup. It is just a draft; I don’t know if I’m missing something.

I would like you to tell me how you think this feature should work. Your opinion as a user is very important. Pitivi wants to be the best option for you!

[Mockup image; SVG available for download]



How to play a video using GES?

Hello. I want to show you an example of a command-line video player using GES. The program is very simple; my goal is for you to learn the basics. I paste the entire code here in case you just want to read it.

from gi.repository import Gtk
from gi.repository import GES
from gi.repository import Gst
from gi.repository import GObject

import signal

video_path = "/home/cfoch/Videos/samples/big_buck_bunny_1080p_stereo.ogg"

def handle_sigint(sig, frame):
    # Called when the user presses Ctrl+C.
    print "Bye!"
    Gtk.main_quit()

def busMessageCb(unused_bus, message):
    # Called for every message posted on the pipeline's bus.
    if message.type == Gst.MessageType.EOS:
        print "EOS: The End"
        Gtk.main_quit()

def duration_querier(pipeline):
    # Print the current playback position (in nanoseconds).
    print pipeline.query_position(Gst.Format.TIME)
    return True

if __name__ == "__main__":
    GObject.threads_init()
    # Gst must be initialized before GES.
    Gst.init(None)
    GES.init()

    video_uri = "file://" + video_path

    # A timeline with one video track and one audio track.
    timeline = GES.Timeline.new_audio_video()
    layer = timeline.append_layer()

    # Create an asset from the file and add it to the layer as a clip.
    asset = GES.UriClipAsset.request_sync(video_uri)
    clip = layer.add_asset(asset, 0, 0, asset.get_duration(), GES.TrackType.UNKNOWN)

    timeline.commit()

    # Link the timeline to a pipeline and start playing.
    pipeline = GES.Pipeline()
    pipeline.set_timeline(timeline)

    pipeline.set_state(Gst.State.PLAYING)

    # Watch the bus to know when the video ends.
    bus = pipeline.get_bus()
    bus.add_signal_watch()
    bus.connect("message", busMessageCb)
    GObject.timeout_add(300, duration_querier, pipeline)

    signal.signal(signal.SIGINT, handle_sigint)

    Gtk.main()

We’re going to start by checking these lines:

    Gst.init(None)
    GES.init()

We always need these lines to initialize GES, and you have to follow this order: initialize Gst before GES. If you swap these lines, your program won’t work.

The next important line is

    timeline = GES.Timeline.new_audio_video()

With this line we create a timeline. I think of the timeline as the general container: it will contain all the elements necessary to play the (edited) video or audio we want. A timeline contains GESVideoTracks and/or GESAudioTracks.

For example, if we want to play a video (with sound), our timeline will contain both a GESVideoTrack and a GESAudioTrack. If we want to play our favorite song, the timeline will contain only a GESAudioTrack.

In this case, this line creates a GESTimeline and adds a GESVideoTrack and a GESAudioTrack to it. It is a shortcut for this:

    videotrack = GES.VideoTrack.new()
    audiotrack = GES.AudioTrack.new()
    timeline = GES.Timeline.new()
    timeline.add_track(videotrack)
    timeline.add_track(audiotrack)

The timeline not only contains tracks; it also contains GESLayers. Our program will have only one GESLayer, but a timeline is a stack of layers, where the topmost layer has the highest priority (by default, the highest priority is 0). But why do we need a GESLayer? It is where we add our clips, our effects, our transitions. So a GESLayer is another container. The following line not only creates a GESLayer, it also appends the created layer to our timeline:

    layer = timeline.append_layer()

What is a bit confusing to me is the GESAsset. I see it as an abstraction of the information about, for example, the video or audio file we will use. However, a GESAsset is not limited to audio and video: it can hold information about an effect or about a GESVideoTrack/GESAudioTrack.

In this case we take advantage of the request_sync method of GESUriClipAsset, which allows us to create a GESAsset from a real file on our computer, for example from a ‘video.ogv’. This method gathers information such as the duration of the video.

    asset = GES.UriClipAsset.request_sync(video_uri)

The following line generates a clip from the asset and adds it to the layer we created before.

    clip = layer.add_asset(asset, 0, 0, asset.get_duration(), GES.TrackType.UNKNOWN)

And to apply our changes…

    timeline.commit()

To display our video on the screen we need a GESPipeline. The GESPipeline allows us to play or render the result of our work, that is, of our timeline. So the pipeline needs to be linked to a timeline.

    pipeline = GES.Pipeline()
    pipeline.set_timeline(timeline)

By default, the state of the pipeline is NULL, but we can change it to PLAYING, and the pipeline's mode can be either preview or RENDER. In PLAYING, the pipeline displays the result on the screen. In RENDER mode it ‘exports’ the result of our work to a file, an out_file.ogv video for example. We set the pipeline to PLAYING because we only want to play the video.

    pipeline.set_state(Gst.State.PLAYING)
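For completeness, switching the same pipeline into render mode looks roughly like this. This is only a sketch: the output URI is a placeholder and the encoding-profile setup is abbreviated (a real program would add audio/video sub-profiles); set_mode() and set_render_settings() are the GESPipeline calls involved.

```python
from gi.repository import GstPbutils

# Sketch only: build a container profile for the output file.
# /tmp/out_file.ogv is a placeholder path.
container_caps = Gst.Caps.from_string("application/ogg")
profile = GstPbutils.EncodingContainerProfile.new("ogg", None, container_caps, None)
# ... add audio/video sub-profiles to `profile` here ...

pipeline.set_render_settings("file:///tmp/out_file.ogv", profile)
pipeline.set_mode(GES.PipelineFlags.RENDER)
pipeline.set_state(Gst.State.PLAYING)
```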

If we remove the bus and signal-handling lines at the end (from `bus = pipeline.get_bus()` down to the `signal.signal(...)` call) and execute the program, we can still see the video playing. But these lines are needed to decide when to stop the program. The program keeps running until something stops it: if we don’t stop it, it will still be executing after our video has finished, and we don’t want that. OK, but what do we have to stop? We have to stop the loop. What loop?

    Gtk.main()

Every pipeline contains a GstBus by default. A GstBus is a system that watches for certain messages; these messages can be, for example, the EOS (end of stream) of the video, or other message types. See http://gstreamer.freedesktop.org/data/doc/gstreamer/head/gstreamer/html/gstreamer-GstMessage.html#GstMessageType. We need to add a message handler to the bus of our pipeline; in Python we use add_signal_watch() (the equivalent of gst_bus_add_watch() in C).

    bus.add_signal_watch()

By the way, if you want to know more about the GstBus, look at this link: http://gstreamer.freedesktop.org/data/doc/gstreamer/head/manual/html/chapter-bus.html

We’ve already added the signal watch, but now we need to specify the function we’re going to use to handle the messages. This function is busMessageCb (‘Cb’ for callback). We connect the bus to this function with the following line.

    bus.connect("message", busMessageCb)

The function busMessageCb is written by the developer. The goal of this function is to stop the loop created by Gtk.main() when the playback of the video has finished.

    def busMessageCb(unused_bus, message):
        if message.type == Gst.MessageType.EOS:
            print "EOS: The End"
            Gtk.main_quit()

We’re almost finished. The following line tells us, every 300 milliseconds, which nanosecond of the video the program is playing: every 300 milliseconds our program executes the duration_querier function, which prints the current position of the video, in nanoseconds, to the console.

    GObject.timeout_add(300, duration_querier, pipeline)
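Since positions queried with Gst.Format.TIME are in nanoseconds, a tiny helper like the following (a hypothetical addition, not part of the program above) makes the printed position human-readable:

```python
def ns_to_timestring(ns):
    """Convert a position in nanoseconds to an 'H:MM:SS.mmm' string."""
    seconds, ns = divmod(ns, 1000000000)  # 1 second = 10^9 nanoseconds
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return "%d:%02d:%02d.%03d" % (hours, minutes, seconds, ns // 1000000)
```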

Finally, as an extra, I want this program to close when the user presses “Ctrl+C”. The line below allows that. We take advantage of Python’s signal library, and the program knows when the “Ctrl+C” keys have been pressed because of the signal.SIGINT parameter. When these keys are pressed, the function handle_sigint(sig, frame) is called, which says "Bye!" and stops the loop.

    signal.signal(signal.SIGINT, handle_sigint)

GES sample files

Hello :) I’ve been exploring Pitivi, GES and GStreamer during this time. I have to say it wasn’t easy for me to understand how GES works. The main problem I had was that there were no tutorials about it, so the way I learned GES was by reading documentation, looking at the Pitivi code (which takes all the time in the world, the Pitivi code base is huge) and asking Pitivi developers on IRC. The lack of tutorials about GES can be quite demotivating for new people who want to help with Pitivi or just with GES. I would like new people not to have the problem I had. Because of that, I will try to write some tutorials about GStreamer Editing Services. For now, I have samples on my GitHub account which you can check out.
https://github.com/cfoch/GESSamples
I will be updating this repo, though not on a regular schedule. I usually create samples because I need to test some specific feature. Anyway, I hope they are useful for you. Enjoy!

Help PiTiVi reach version 1.0

PiTiVi, the open source video editor, has started a fundraiser to reach its 1.0 version.

As many Linux users may know, there are currently few open source programs for editing video, and on top of that, none of them are very good. Over the last two years, PiTiVi has moved to a powerful video-editing library called GStreamer Editing Services (GES); PiTiVi is now a UI for this library. The two developers who worked on GES have started an online fundraising campaign to make PiTiVi stable, offer a quality program and release PiTiVi 1.0. So far, more than a third of the total (€35,000) has been raised in less than two weeks.

PiTiVi's motto is "allowing everyone on the planet to express themselves through filmmaking, with tools that they can own and improve". The team wants to build a flexible, high-end video-editing application that can be used by professionals and amateurs alike.

From a technical point of view, PiTiVi is probably the most promising open source video editor (available on Linux), because it is backed by GStreamer (a multimedia framework), which is used by most Linux distributions. It has required a lot of time from the GES and GStreamer developers. Using a high-end video-editing library like GES makes it relatively easy to create a video editor adapted to specific needs, for example. All the applications that use GStreamer (such as Totem) will benefit from its development.

This is not a one-person initiative. First of all, two is more than one. Secondly, GES has been developed in very close collaboration with the GStreamer team. Besides, there are other developers and contributors involved, and every summer the Google Summer of Code students show up. The community is very active!

The role of this campaign is to accelerate development so the goals are reached faster. Donations go through the non-profit GNOME Foundation, which means that donations in the USA are tax-deductible. Please watch the campaign video and donate at http://fundraiser.pitivi.org

Thanks :o)

Exams and about gtravel-webapp

(Midterm) exams are near. My friends and I had to explain our database model to the professor. There were some errors in the data types of the fields, but nothing serious. We were using ERwin (over Oracle). Actually, I don’t like ERwin, but I had to use it. Anyway, the implementation in gtravel-webapp will be different: I am using Django+MySQL, and I’ve been learning it while writing code at the same time. I was looking for some libraries/modules to validate IBAN and SWIFT/BIC codes and phone numbers, and I think I will end up using them.
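As a sidenote on what such validation modules do: IBAN check digits can be verified with the ISO 13616 mod-97 rule. A minimal sketch (an illustration, not the code of any particular library; "GB82 WEST 1234 5698 7654 32" is the usual documentation example):

```python
def iban_is_valid(iban):
    """Check IBAN check digits using the ISO 13616 mod-97 rule."""
    iban = iban.replace(" ", "").upper()
    # Move the first four characters (country code + check digits) to the end.
    rearranged = iban[4:] + iban[:4]
    # Replace each letter with two digits: A=10, B=11, ..., Z=35.
    digits = "".join(str(int(c, 36)) for c in rearranged)
    # The whole number must be congruent to 1 modulo 97.
    return int(digits) % 97 == 1
```

A real application would also check the country-specific length and format before the checksum, which is what dedicated modules handle.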

I’ve been spending about 40-60 minutes per day writing documentation or code related to the project. But I think this week I’m going to be very busy: what is really demanding about the programme at my university is that we have at least 3 graded practice tests each week.

Read you!

https://github.com/cfoch/gtravel-webapp

New model.


Fixed model (95%)

