For migrating a very large Plone 3 instance to Plone 4 we wanted to walk through the Zope database while avoiding the Plone catalog. Looping through the results of ZopeFind with search_sub=1 (which includes subfolders) means that a massive list of results has to be generated first, before anything can be done with them, which takes a long time. With a large database this also uses a lot of RAM. What we needed was a recursive generator, but it took me a long time to wrap my head around how to write one. This article explains it nicely: http://linuxgazette.net/100/pramode.html.
I didn’t get it until I realized that when you call the function you get back a generator (because the function body contains a yield). This is why you have to loop through your recursive call. Python generators were introduced about 10 years ago; I’m only now starting to realize what I’ve been missing!
def walk(node):
    # search_sub=0: only direct children; we recurse into folders ourselves
    for idx, sub_node in node.ZopeFind(node, search_sub=0):
        yield sub_node
        if getattr(sub_node, "meta_type", "") in ['ATBTreeFolder', 'ATFolder']:
            # the recursive call returns a generator, so loop over it
            for sub_sub_node in walk(sub_node):
                yield sub_sub_node

portal = app.portal
walker = walk(portal)
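The same pattern stripped of the Zope specifics, using a nested list as a stand-in for the folder tree (a sketch for illustration only), shows why you have to loop over the recursive call:

```python
def flatten(items):
    """Recursively yield leaf values from an arbitrarily nested list."""
    for item in items:
        if isinstance(item, list):
            # Calling flatten() returns a generator object, so we must
            # iterate over it and re-yield each value it produces.
            for leaf in flatten(item):
                yield leaf
        else:
            yield item

print(list(flatten([1, [2, [3, 4]], 5])))  # → [1, 2, 3, 4, 5]
```

Values are produced one at a time as the tree is walked, so nothing like the full ZopeFind result list ever has to be held in memory.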
Incidentally, I came across a very creative solution to traversing the Zope database in collective.solr:
One of the nice things about having declarative configuration in xml files is that you can use standard xml tools to do interesting things with them. XMLStarlet is a handy command line tool for manipulating xml. For a migration from a Plone 3 site to Plone 4 I wanted to check some things. For example, the Portlets Generic Setup syntax changes document explains that the use of the “for” attribute in a portlet element is deprecated, so I would like to know if we have any Generic Setup xml with a portlet element which has a “for” attribute:
$ xmlstarlet sel -t -m "//portlet[@for]" -c . some.xml
This uses the XPath expression “//portlet[@for]” to select any <portlet> elements which have a “for” attribute, and returns a copy of each element which matches. Let’s combine this with find:
This can easily be adjusted for more complicated queries. The interesting part is the XPath expression: if you want a copy of the whole matching element use “-c .” as above, and if you just want a value use “-v”. See http://xmlstar.sourceforge.net/doc/xmlstarlet.txt for more info.
For the recent Plone conference in Munich we decided to try to do the video ourselves. The primary motivation was to save some money: the professional video companies we contacted quoted several thousand Euro (which is perfectly reasonable for covering 30 talks) but our budget was limited. We also wanted to get the video online as soon as possible, and no conventional video company could offer this since they all required additional time to do the typical post-production work. We decided to use the same process that Debian has been using for its conferences for years, which has more recently been adopted by PyCon US and FOSDEM among many others.
The key component in this setup is DVswitch: “a digital video mixer intended for interactive live mixing of several incoming DV video streams”. It is really simple to use and allows someone without any experience in video to do the editing while the talk is taking place, which cuts out the need for lengthy post-production. It is even possible to stream the video live, although we decided to keep things simple for our first attempt and just record the video to be uploaded at a later stage. As the editor you can choose to display the output of the laptop fullscreen, the output of the camera fullscreen, or you can draw a rectangle anywhere on the screen to display one source inside the other (picture-in-picture, see illustration 2 below).
We needed to cover two rooms and in an effort to keep the setup as simple as possible we decided to just use one laptop, camera and mixing desk per room. I learnt two important things about Firewire when setting up and configuring the systems. First of all, it is only possible to use one video device per controller. We were lucky that both of our laptops had Firewire inputs as well as expansion slots which allowed us to add a Firewire card to each laptop giving us the two controllers we needed per laptop (the two additional Firewire ports on the card could not be used). Otherwise we could have used two laptops per room and connected them via Ethernet. This is actually the more typical configuration. The other thing I learned was that you should never connect a Firewire device while it is turned on! I was trying to figure out why a particular camera would not work consistently with the system when I learnt about this. Luckily nothing was damaged, but if you search for Firewire Hotplug Damage you’ll quickly find many horror stories. Admittedly, all the stories I came across were from Mac users so maybe “It’s a Mac thing”®, but I have no intention of testing this 🙂
The cameras we used were good old Sony DV cameras, a VX2000 and a VX2100. These are really nice cameras with great low light capabilities and plenty of manual control. Most importantly, they have Firewire outputs and mic inputs (mini-jack rather than XLR). They worked without a hitch, unlike a modern HD camera we borrowed for testing, which would occasionally just stop working and had to be power cycled before it would work again.
Laptops and cameras are pretty easy to come by, but the other piece in the puzzle is a frame grabber. This takes the VGA output of a laptop and digitizes it so that it can be sent as a DV stream over Firewire to the laptop; it also passes the VGA output through to the projector. Many thanks to the good folks on the DVswitch and PyCon-AV mailing lists for guidance and advice on the available options. We stuck with the Canopus TwinPact 100, which is what most people use. It’s not cheap at over €500 per unit, but we decided it would be a worthwhile investment since we can use it again in the future.
Connecting it all together is quite straightforward. The diagram below is not exactly to scale, so it’s important to note that the camera, TwinPact and laptop have to be pretty close together, because Firewire cables can’t be very long (~5M is OK). VGA and XLR/audio cables, on the other hand, can be very long: for one talk we joined two 10M VGA cables with a connector and it worked perfectly, and the audio cable was 25M in one room!
If the language of the conference is written from left to right, then the ideal configuration is to have the projector screen to the right hand side of the speaker (as the speaker faces the audience). This often leaves a neat little corner on the bottom right of the screen for inserting the video of the speaker. You might also be able to throw some extra light onto the speaker without it washing out the image on the projector screen.
One of the laptops was bought with the intention of using this system again for other events. This hadn’t been heavily tested and we were unfortunate to run into an issue with the Intel graphics module, Ubuntu and Gnome 3. Switching to the 2D Unity desktop helped considerably, although we did have further issues which seemed to require a reboot once a day. The other laptop was until recently my primary development machine (thanks Syslab!) so I knew I could trust it and I set it up myself. NixOS is my distro of choice so I used a simple config with just XFCE, DVSwitch, Firefox, Pidgin and VLC. Ancient proverb: “If it’s not tested, it’s broken”.
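For reference, a minimal sketch of the kind of NixOS configuration described; the package and option names here are assumptions (dvswitch packaging in particular has varied between nixpkgs releases), not a copy of the actual config:

```nix
{ pkgs, ... }: {
  # Bare X session with XFCE as the desktop
  services.xserver.enable = true;
  services.xserver.desktopManager.xfce.enable = true;

  # Assumed package names; dvswitch availability depends on the channel
  environment.systemPackages = with pkgs; [ dvswitch firefox pidgin vlc ];
}
```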
Audio was a constant concern. We noticed a hum from the PA which we couldn’t do anything about, and there was considerable hiss on the other system. As the conference progressed we got a bit sloppy with the cables, and at one stage an unshielded audio cable (mixer -> camera) was running straight along a power cable! That produced a disturbing ground loop hum which couldn’t be totally cleaned up in post-production. Apart from that, there was a bit of feedback from the PA system dependent on the position of the speaker, and mobile phones and UMTS/3G dongles occasionally played havoc.
There is really only one button on the TwinPact which you need to press: the Overscan button. Sometimes (it only seemed to happen with Apple laptops, though not with all of them) the picture from the TwinPact would go crazy even though the image from the projector was fine. Pressing the Overscan button caused a thick border to be displayed around the image, but apart from that everything was then fine.
Apart from these minor issues it was a roaring success. There is a palpable fear in the Plone community right now that some of our brightest stars will be talent spotted on YouTube and seduced into signing deals with major movie studios, much to the detriment of the Content Management Systems of the future. If you are a such a talent spotter please don’t look at the videos.
Post-production notes (maybe a separate post):
Kdenlive: in-out points and transcoding
Audacity and ffmpeg: audio cleanup
The spectacular AV-Team: Geraldo, Hans-Jürgen, Sonnja, Toni. (I will add links)
The people who made it possible: Max, Patrick, Philip.
Everyone else and also http://www.ragemaker.net and the rage comic community for making enterprise grade illustration material attainable to everyone in a world gone off the rails.