Sense Hofstede » Planet Ubuntu
Wed, 20 Aug 2014 16:19:52 +0000

Just For Learning: an online classroom from Ubuntu Nederland
Thu, 12 Jan 2012 09:51:28 +0000

Since it is almost that time of the year again when the Ubuntu Developer Week will start, I would like to introduce you to a nice tool developed by a team in the Dutch Ubuntu community: Just For Learning. Very much like the well-known desktop application Lernid, the goal of Just For Learning is to make giving online workshops easier. Instead of having to learn how to connect to two IRC channels, attendees only have to visit a simple web application.

In my local community, Ubuntu Nederland, we have a team that occupies itself with just that: educating the community. The team of Ubuntu NL Mwanzo is very active in reaching out to newcomers to our LoCo. A part of their pursuits concerns giving workshops just like the Ubuntu Developer Week I mentioned earlier. They usually employed IRC to hold these sessions, following the familiar #*-classroom/#*-classroom-chat paradigm. However, they thought it could be done better and a group of interested people got to work.

The application, available in both English and Dutch, has three IRC-based chatrooms: one for the teacher, one for asking and answering questions and one for general chat. A fourth pane shows the schedule and displays links mentioned in the classroom. People who want to follow the course with their own IRC client can choose to do so too. Teachers are authenticated using the Ubuntu SSO Service; students only have to give a user name.

The Dutch instance is already live. As you can see, the interface is very simple and easy to use. When you hold your event, all you need to do is provide a link to the site and your audience can join in from any browser on any device with an internet connection.

If you or your team are interested in using this, or if you have any questions, please take a look at the project’s Launchpad page or leave a comment here. You may also try the #Ubuntu-nl-mwanzo IRC channel. The code is located in the Bazaar branch lp:justforlearning. Note that the code is not yet stable!

A community needs something to do
Thu, 06 Oct 2011 11:27:22 +0000

Reading Amber Graner’s post about Tuesday’s meeting of the Community Council, I recognised a feeling that I also had when I stopped being an active member of the community in January this year. At that time I felt depleted of the enthusiasm for the community that once flowed through my veins. What made my enthusiasm go away?

Apart from changes in my personal life, I was also affected by a sense of purposelessness. I had no idea what I was doing in the community anymore, so I just quit. I feel that this is a general problem in the community. A successful and happy community needs something to do. It needs a responsibility, something it can focus on. For the greatest part of the five years that I have been an active part of the Ubuntu community, I felt part of a group that was working together on creating the best distribution. There was a clear goal: making the best operating system of them all.

The difference with today is that innovation and change seemed to live much more within the community back then. We were all excited about the upcoming changes and the direction we were heading in, because we all knew what those changes and directions were. For people to be excited about something, they need to know what to expect.

Ubuntu has matured enormously. Canonical has acquired so many skilled people that I do not fear for the quality of Ubuntu. It is only going to get better. However, that maturation has come at a price: as Canonical moved more and more to an Apple-style secrecy surrounding its plans, the community has been robbed of the vital basking in the glory of the upcoming changes. Because of the way the plans are announced, the community also doesn’t always feel them to be its own.

Is this a bad thing? Not necessarily. I believe that Canonical’s approach will probably lead to better designs. However, the atmosphere in the community needs to be improved if we want to keep everyone motivated and give people a sense of purpose. Currently, the responsibility of the community is quite uncertain. In its communications, Canonical makes it appear that the community has more influence than it actually has: it says that all employees are community members, whereas many in fact are not; it decides many things on its own, but then says it involves the community. In practice, Canonical seems to steer the Ubuntu project on its own.

I do not disagree with the way Ubuntu is run. However, I do believe it is vital that the truth is not denied, as the political ‘leaders’ of Europe are currently doing when talking about Greece. Saying that Ubuntu is created by the community does not make it true. Stop it. Honesty will improve a lot, because it will reduce unrealistic expectations.

With the fallacy of the community running the project removed, the community does need something to replace it. To return to the beginning of my post: what I believe is causing the leadership lethargy mentioned in the Community Council meeting is uncertainty about the responsibility of the community. It should be made clear exactly what role the community plays in creating Ubuntu. What decisions can it make? What can it contribute? What is the reach of the Community Council’s authority over the project? Once that is clear, the roles of the different leaders can be defined within that responsibility. Then they know what their purpose is. Having a purpose and having influence motivates people.

Somewhat related: maybe it would be a good idea to make someone in every team responsible for community and contribution management. For example, if you as a community volunteer contribute code to the desktop, who will look after you? Who will make sure that your work doesn’t go to waste? Such a change would also take some stress off the shoulders of the community team, which should not be stretched across such wide purposes.

Making reporting bugs harder: desirable?
Wed, 03 Aug 2011 16:19:51 +0000

When I started writing this post, the latest bug report on Launchpad was bug #820459. That’s right: since the start of the Ubuntu project, 820,459 bugs have been reported on Launchpad and its Ubuntu Bugzilla predecessor. Though that number includes bugs reported against other projects on Launchpad, the majority of those reports relate to Ubuntu.

The number of bugs reported every day is huge. It’s a continuous flow of problem reports, Apport crash reports, wrongly placed support requests, trolling, feature requests and distress. Heroically fighting to stem the flood is the Ubuntu Bug Squad. Together with specialised bug triage teams for certain packages, like the kernel, they try to process as many useful bug reports as they can. However, there are too few triagers for too many bugs.

The current situation is not good for the people who work so hard to process all the reports; many leave the team soon after joining. It also causes relevant bugs to be lost in a sea of unprocessed or half-processed bogus bugs that clog up the system. It has been proposed before, but maybe we should once again seriously consider discouraging non-technical users from reporting bugs.

If we decided to do so, regular users would be kept away from the bug tracker. Only automatically generated crash reports from Apport should still be allowed, because that process is such that bogus reports rarely happen and many triage steps for this particular kind of bug can be automated. We would remove the ‘Help->Report a bug’ menu item everywhere, including in alpha releases. Links to reporting a bug should be removed from the documentation and the official sites. Launchpad could be adapted to make the ‘Report a bug’ button less obvious.

All this should lead to fewer bug reports and a higher average quality of the reports. If we focused only on the technically capable and interested users, we would have fewer clueless reports. It would save the time, energy and motivation of the bug triagers, who could then focus on making sure that every bug that is reported gets processed quickly.

However, we should not forget that one of the things Ubuntu is often credited for is the large number of bugs it forwards upstream. Furthermore, an even more important argument in favour of bug reporting for the masses is the fact that technical users use their computers differently from non-technical users. They might miss bugs that non-technical users do encounter, or see no problem in a feature of the system that is terribly confusing for non-technical users.

Limiting bug reporting would deprive us of this, and that seems sufficiently bad to me to doubt whether we should limit bug reporting at all. I really don’t know what would be best. Making it harder to report bugs would make managing the bugs easier, but wouldn’t that also make the bugs we manage worth less? What do you, oh dear reader, think?

Ubuntu needs the GNOME 3 project, all of it
Mon, 25 Jul 2011 20:06:15 +0000

Watching the news of Apple’s release of OS X Lion and the cheering reviews that followed, the sheer quality of what we are up against becomes very clear once more. If you look at the operating system that Apple is delivering, you see not only the polish that it is so famous for; it also delivers functionality underneath that polish. You can make your operating system as user-friendly as you want, but you will still lose if you cannot do much with it.

Ubuntu’s great success in recent years has come mostly from the fact that Canonical is very good at adding polish to the functionality that was already there. It made the great tools of the free desktop usable by everyone. It still does this wonderfully and I have full confidence that the Canonical Design Team will continue to make Ubuntu suit its users even better.

However, while reading OMG! Ubuntu!’s post about the mock-ups of the music and document file browsers, something struck me. Something that started to bug me while trying out GNOME Shell now became clear: Canonical may be very good at polishing, and it may be very good at innovating user interfaces, but it cannot do without GNOME. It lacks content.

Now, I don’t mean this in a demeaning way. I have great respect for the vision that speaks from Unity. However, I would like to emphasise that working within GNOME would be much better for Ubuntu in the long term, no matter how hard it would be in the short term.

When comparing Unity and GNOME Shell, I noticed right away how clearly a philosophy speaks from GNOME Shell. When using GNOME 3, you can really notice how its developers have purposely worked together to create a coherent experience. It feels nicer than Unity. Plus: it uses GNOME technologies, and that improves its integration with the rest of the desktop tremendously.

But after a while you start noticing a few things. GNOME Shell is less stable than Unity and it feels less solid and responsive. Moreover, whereas Unity’s rough edges are confined to its fringes, GNOME Shell has rough edges spread evenly all over. GNOME 3 looks less slick and sharp than Unity, and GNOME 3’s default theme is less crisp than Ubuntu’s Ambiance.

It is a terrible shame that the huge effort Canonical made to get Unity to the high level it is at today was not spent on making GNOME Shell even better. Canonical may be stubborn, but the company has great ideas and it could have done so much to make GNOME Shell really slick.

Canonical is not a very large company. It does not have enough employees to create and maintain a whole desktop. This is already showing in the stalled innovation of Notify OSD and friends; I am absolutely jealous of GNOME Shell’s notification area. While GNOME is working on expanding and improving its GNOME 3 desktop, Canonical is still very busy with its own shell. The consequence is that the shell does not integrate with the rest of the applications as much as you’d hope. There is a lot to improve in the GNOME project, but when you improve it, you are sure that it fits with the rest of the desktop and that it will look and behave the same.

The Documents and Media file browsers I mentioned earlier can be great ways to give users access to their files. However, every time you implement a way to access stuff like this, you make a paradigm choice. If you want to satisfy the user, it should be consistent. Unity also gives the user access to files, but it does so in a different way. This causes a collision of paradigms. If Canonical wants to do it right, it should ensure consistency across all applications. This is a lot of work and will probably require the development of its own file manager, etc, in the long term.

Canonical does not have the workforce to fully maintain its own desktop. By creating its own shell, it may improve things in the short term, but it will only make things worse in the long run. While GNOME progresses along a different path, the two desktops will diverge even further. In the end, if we ever want to beat Mac OS X, Ubuntu will have to get rid of GNOME and Canonical will have to have grown substantially.

GNOME needs Canonical as well. There is no other company in the Linux distribution world that focusses on regular consumers, and regular consumers are the target group that shapes the OSes of today. I’m not sure how much longer Novell’s remains will stay around, Nokia seems to be on a suicide mission and Red Hat is a business-oriented company. GNOME 3’s magnificent user interface philosophy is in need of a good set of clothes and proper manners, and of all the companies in existence today, Canonical is the best candidate to look after that.

My ego is not so large that I believe this blog post can change Canonical’s company policy—which naturally wasn’t thought out in one hour—but I do wish to add my voice to the chorus that says: Ubuntu should return to GNOME 3!

Hooking your guitar up to PulseAudio: out-of-the-box easy! (with PCM2904)
Tue, 21 Jun 2011 10:34:33 +0000

Yesterday I bought myself a shiny new toy: a Midiplus Audiolink. With this USB device you can connect your guitar to your computer. (According to lsusb, the chip is a ‘Texas Instruments Japan PCM2904 Audio Codec’.) I may not have an electric guitar, but my wonderful Crafter TC035/N does have a pick-up element and built-in preamp. With this device I can record and play back without having to buy an expensive amplifier.

When I bought it, I got a driver CD as well as warnings about Windows Vista. I hadn’t checked for Ubuntu compatibility, so I did expect some trouble and braced myself for hours and hours of fiddling with JACK to make it work. So I started by installing Ardour and seeing if I could get anything recorded there. Nothing.

Then I remembered my system was sporting the flexible PulseAudio. So I went to the volume control and saw the Audiolink listed under input devices. After I had selected it, I opened the GNOME Sound Recorder, pressed record and started to play. Bingo! Everything just worked. No fiddling needed. All you have to do is select a different input in the default volume control.

This really demonstrates the power and user-friendliness of PulseAudio. I know that with JACK it is possible to do quite a lot with input sources, but that is more than I need. I just want something simple, and this is working great for me! Thank you, PulseAudio developers!

GNOME 3: configuration wish granted
Sat, 09 Apr 2011 10:56:53 +0000

Almost two and a half years ago, GNOME 3 was in a very early state and most plans still had to be drawn up. At that time there was an interesting meme going around on Planet GNOME in which people blogged about their wishes for the large changes that would come to GNOME.

Inspired by that, I also blogged about what I would love to see in GNOME 3: ‘My wish for GNOME 3: better configuration tools’. My gripe was the huge number of tiny configuration utilities that swamped the Preferences menu. Not only was it confusing to spread the settings across so many different windows, it also looked bad and navigated horribly. The settings were scattered and not always easy to find.

A few days ago I installed the GNOME 3 LiveCD on my system to give GNOME Shell a try. I was amazed by the great progress that has been made with the GNOME Control Centre. Not only does it look much, much better than its previous incarnations, it also works much better.

Everything is now in one place by default. Navigation has improved. The way the configuration ‘capplets’ are designed is a lot more intuitive. It is no longer as cluttered as it used to be and it is easier to control. It is a real shame that it is not in Ubuntu Natty, because it is a tremendous improvement over what you currently get when you press System Settings in the session indicator.

Looking back at the blog post I wrote in 2008 I can say that GNOME 3 has fulfilled my wishes for better configuration tools. It’s a great release, congratulations on this milestone, GNOME!

The difference between local communities and local teams
Tue, 05 Apr 2011 17:35:19 +0000

The official directory of Ubuntu’s LoCos goes under the name ‘Ubuntu Local Community Team Directory’. This neatly covers both names that are frequently used to denote the different types of groups that are locally active for Ubuntu: local communities and local teams. Although no one currently makes that distinction, I would like to suggest that we should. I believe there are two sorts of local groups, between which there are clear differences. It is important to be aware of this fundamental disparity if we want to accommodate both types as well as possible.

Let us first take a look at the defined purpose of the ‘local community teams’. The ‘About Local Community Teams’ page at the LoCo Directory has the following to say about it:

With the incredible success of Ubuntu around the world, the LoCo project is here to help groups of Ubuntu fans and enthusiasts work together in regional teams to help advocate, promote, translate, develop and otherwise improve Ubuntu.

Local community teams are supposed to cater to a defined geographic area. In the United States and the Russian Federation they ought to cover a state; in the rest of the world, a complete country. This geographic constraint is important to note, since we will see later that it doesn’t always fit well in the case of ‘local communities’.

Within their geographic area, local community teams are expected to advocate Ubuntu and organise local activities and informal meetings. This all happens ‘in real life’; the internet presence is often very limited. Teams that behave according to this description are what I would like to call ‘local teams’. Basically all local community teams from the English-speaking countries are of this type.

Since the international Ubuntu community is English-speaking and provides excellent support via the Ubuntu Forums, Ask Ubuntu and the #ubuntu IRC channel, the English-speaking local community teams don’t need a broad online community of their own. They are limited to small forums that are mostly used to discuss the activities that take place ‘in real life’ and to complement social contact. For the most part they are integrated into the international community.

Virtually all other local community teams, however, can be classified as ‘local communities’. Because many, if not most, of their members do not speak English, they cannot use the online support from the international community. The main focus of these ‘local communities’ is not so much activities ‘in real life’, but rather the management of a language-specific Ubuntu community. Running a community is not easy, so it takes a lot of resources.

The local teams have much more time and energy to spend on organising local activities and meetings, because there already is an online community for them: the international one. You can call the local teams a team because they often consist of equals, working together to spread Ubuntu and enjoying each other’s presence.

A local community, in contrast with a local team, does not consist of equals working together on the same things. There are many more different functions within a community: there is support to be given, documentation to be translated and written, interface text to translate, and all of this needs to be organised. Furthermore, those who come to ask for support are a fundamentally different kind of community member from the computer enthusiast who joins a local team to help out. Many of the people who come to ask for support are regular users who have no desire to become an active contributor; they only want help. The spirit of a local community is therefore very different from that of a local team.

What makes this even more complicated is that language boundaries and geographical boundaries often don’t match. For example, Spanish is spoken in many different countries by people across the whole world. Therefore, having one Spanish support channel and forum makes sense. Add to that the Spanish translators of Ubuntu’s interface and documentation, and what you get is a very large community, separate from the international one, transcending geographic borders. The difference between this conglomeration of several ‘local community teams’ and, say, the local team of the US state of Massachusetts is like night and day. Countries like Belgium, where Dutch, French and German are spoken, are even more complicated, because you have multiple language communities within the same borders.

Local communities and local teams both do very valuable work. I think it is a shame that many local communities are not as active locally as local teams often are. But there is a reason for this: founding, building and running a full-blown online community puts a hefty toll on your volunteers. If you are small, like Ubuntu Nederland, you don’t have a lot of spare people left to organise social activities ‘in real life’.

Duplicating the international Ubuntu community in your own language is hard; it requires skill to do it successfully. Good documentation can help with that. Unfortunately, most of the documentation—written in the English-speaking international community—focuses on local teams instead of local communities. Also, the requirements of the LoCo Council put a lot of emphasis on local activities, whereas it can already be a tremendous achievement to ‘only’ have a solid online presence.

I should add some nuance to this. Of course there are local community teams that under my classification would be local communities but which do have a strong local presence; there are probably also local teams that have a large online community of their own. I acknowledge this and think that the definitions of a local team and a local community should be stretched enough to accommodate this. However, the general observation still applies.

What to do with this analysis? Firstly, I think that the international community should be more conscious of the fact that non-English local community teams are distinctly different from English-speaking local community teams, because of the boundary created by language. We should be aware that there are several parallel Ubuntu communities in other languages that over time may have grown identities of their own. They may see themselves not as a localised annexe of the international Ubuntu community, but instead as the equivalent, for their own language, of what is often perceived to be the English Ubuntu community but is in fact the international one.

Secondly, there also is a task for the leaders of the local communities, who could make their people more aware of the way things work in the Ubuntu project and explain what more there is to do if you learn English. They can help to bridge the gap between the different language communities.

Thirdly and finally, the LoCo Council should take into account when judging the performance of local community teams that not all of them have the extra burden of having to build a complete new community from the ground up.

Localisation for the USA, necessary too
Sun, 03 Apr 2011 12:21:40 +0000

By convention, the default locale of all applications is US English. This is of course very imperialistic and evil, and the Americans are indeed forcing their culture upon the rest of the world. But in the end we need a default for the ‘C locale’, and it was decided to stick with the language used in the place where most of modern computing actually originated. Using Latin would have been a bit awkward, and even Esperanto isn’t entirely culturally neutral either.

One could argue that Americans—no, I’m not going to write USanians—derive a large advantage from the fact that the default locale is their English variant. All software is understandable for them right from the beginning. They never have to wait for translations. However, in this piece I would like to argue that actually it is a disadvantage.

The Disadvantage

Why would it be a disadvantage to Americans that all software automatically suits their customs and follows their local quirks? Well, for that I would like to play a game of compare and contrast. Mostly contrast. You see, the US English strings are the only texts written by the developers themselves.

Development attracts people who like to develop, not people who like to write. They do not necessarily come from the United States, often are not native English speakers, and many of them can’t see the use of arguing about -ize vs -ise, or have their own opinions about it. The consequence is that the US English strings are written by people whose primary interest is writing code, not human language. This is detrimental to the quality of the texts, the suitability of the chosen phrases, and spelling and grammar in general.

Translation teams, however, attract people who are interested in language. In the world of perfect localisation, all typography nuts, grammar enthusiasts and spelling bees will join together to form a team with Super Language Powers. This means that, if you speak any language or dialect other than US English, the people who write the text you see every day on your computer are fond of language, know how to use it and have the experience to use it well.

The Consequences

All languages—except US English—have a corrective filter between the developer’s work and the end-user. There is one community that oversees all use of language in the product. Translation teams often work with word lists, style guides and the selection of contributors based on their quality. This allows them to guarantee quality, make sure that all text on the system follows the same conventions and ensure consistency across the desktop. You can correct for overuse or underuse of capitalisation, distinguish between the computer and the user in events by using different verb conjugations, and so on.

Consistency is an important issue. For example, a computer can have a screen, a display and a view. These words are near synonyms, but the X server uses them to distinguish between three different things. It is hard enough already for a user to understand what the system is talking about; it becomes even harder when words can have different meanings in different applications. When there is no central organisation of the terminology, this does happen. Translators could correct for this by adapting the translations to the context, but Americans are out of luck.
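To make the idea concrete, here is a toy sketch of the kind of check such a terminology watchdog could run: scan extracted UI strings for competing near-synonyms that may denote the same concept, so the team can settle on one canonical term. This is not an existing Ubuntu tool; the synonym groups and example strings are entirely hypothetical.

```python
from collections import defaultdict
import re

# Hypothetical concept -> competing terms lists a style team might maintain.
SYNONYM_GROUPS = {
    "display-surface": {"screen", "display", "view"},
    "folder": {"folder", "directory"},
}

def find_term_mixing(strings):
    """Return, per concept, which competing terms appear and in which strings."""
    hits = defaultdict(lambda: defaultdict(list))
    for i, s in enumerate(strings):
        words = set(re.findall(r"[a-z]+", s.lower()))
        for concept, terms in SYNONYM_GROUPS.items():
            for term in words & terms:
                hits[concept][term].append(i)
    # Only concepts where more than one competing term is in use are a problem.
    return {c: dict(t) for c, t in hits.items() if len(t) > 1}

strings = [
    "Open the folder containing the file",
    "Choose a directory to save to",
    "Extend the desktop to a second screen",
]
print(find_term_mixing(strings))
# Flags 'folder' vs 'directory'; 'screen' alone is not reported.
```

A real tool would of course read the strings from gettext catalogues rather than a list, but the principle is the same: no central organisation, no report.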

There is no one looking after the typography, grammar, capitalisation and readability of US English. But there is more to localisation: translation teams also make sure that the system uses the correct date format, currency, decimal delimiter and so on. Each country has its own conventions here. No one has the job of nitpicking about the American conventions, so they’re missing a watchful eye here as well.

The result is that the US English desktop can often be inconsistent in style, word choice and spelling. This makes our product less appealing to Americans and to other people using the US English version. If we want to pursue perfection, we should not overlook this.

This also has consequences for the translations. The translations are translations of the original English texts. Although I said earlier that translators can correct for inconsistency and bad wording, they don’t always do so. It is a lot of work to manually check the context of each and every string; many translators just stick to translating every word with the same phrase. Badly placed capitals and full stops will often find their way into translations as well. Vague US English results in vague translations.


Improving the quality of US English will mean large improvements for all languages, if it is done properly: by sending patches with corrections to the developers. I am convinced that we need an American ‘localisation team’, consisting of all the American typography nuts, grammar enthusiasts and spelling bees who want to contribute to Ubuntu and FOSS in general. They could work together with other projects to establish conventions and methodically go through all applications to check whether they comply with those conventions.

We cannot ask all the localisers to understand programming languages and patching systems. However, with the current state of technology, I am afraid that writing patches directly against the code is the only option. In the long term, something like a POT editor and a reverse POT generator could improve things.
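To illustrate what a 'reverse POT generator' might mean, here is a minimal sketch: treat the corrected US English strings as a translation catalogue, then turn that catalogue into substitutions applied back to the source, which a developer could review as a patch. The catalogue format handling is deliberately naive and all names and strings here are hypothetical, not an existing tool.

```python
def parse_catalogue(text):
    """Parse a tiny msgid/msgstr-style catalogue into a corrections mapping."""
    mapping = {}
    msgid = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith('msgid '):
            msgid = line[6:].strip('"')
        elif line.startswith('msgstr ') and msgid is not None:
            corrected = line[7:].strip('"')
            # An empty or identical msgstr means 'no correction needed'.
            if corrected and corrected != msgid:
                mapping[msgid] = corrected
            msgid = None
    return mapping

def apply_corrections(source, mapping):
    """Substitute corrected strings back into a piece of source code."""
    for original, corrected in mapping.items():
        source = source.replace('"%s"' % original, '"%s"' % corrected)
    return source

catalogue = '''
msgid "Pick A Directory"
msgstr "Pick a folder"

msgid "Quit"
msgstr ""
'''

code = 'label = _("Pick A Directory")'
mapping = parse_catalogue(catalogue)
print(apply_corrections(code, mapping))
# prints: label = _("Pick a folder")
```

A production version would need a real PO parser and context-aware matching, but the workflow is the point: the American team edits a catalogue, and the tool generates reviewable source changes.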

It would also require infrastructure. Many languages have project-agnostic communities for translation in FOSS that provide various language-related services; examples are the French Traduc and the Dutch OpenTaal. These relations are often not formalised, but they are really helpful in making sure everyone follows roughly the same rules. As part of FreeDesktop, an American initiative could be started, which could keep a list of the standard meanings and uses of words.

The solutions above are just ideas for ways to deal with a problem that we should give much more attention than we have done so far. Admittedly, it is easy for me to talk from the sideline, knowing that I—being a native speaker of the Dutch language, not of US English—will never be doing much of the work I propose. But I do hope that some people will be inspired by this piece and do something with it.

Those poor Americans deserve localisation too!

Canonical and GNOME: the Atlantic chasm?
Sun, 13 Mar 2011 19:55:30 +0000

While contemplating the tensions between Canonical/Ubuntu and GNOME that a lot of people have been blogging about, I just had an insight. I have thought of what we are observing here as a clash of cultures before, but merely as a clash of company cultures. However, aren’t we observing something that has deeper cultural roots?

GNOME was founded by two Mexicans and currently seems to be dominated by people from the United States, with the foundation itself being based in the States. The most important companies behind it, Red Hat and Novell, are both from the United States too.

Ubuntu was started and Canonical was founded by a South African based in London, where the company has its headquarters. Although there is again a very high American presence within the community and company, the leadership is much more eclectic than GNOME’s. Furthermore, the Canonical Design Team seems to be predominantly British.

Why would this matter? Our differences aren’t very large after all; the open source community is dominated (unfortunately) by white males, and they have a lot in common. Still, I think it may play a more important role than we have thought so far. Communication is very culture-bound, and it seems that it is communication that has been causing most of the problems.

If we look at the rejection of ‘libappindicator’ as an external dependency, we may be able to see this more clearly. Canonical, say some people in the GNOME project, failed to push for its inclusion thoroughly enough. It may have done what was formally required, but it didn’t show the initiative that could have resolved the issues that were raised. They say that you need to find the right people to talk to, not expect machinery to process your request once you’ve delivered an appropriately tagged package.

Canonical reiterates that it did what was required to propose a module as an external dependency. They say that they want to have someone to talk to, to have someone in charge who makes the decisions and can be phoned up if necessary.

Both parties expected different things from the other. This may be what caused the unease. Each party feels that it did enough and the other too little, so no one is to blame.

Strikingly, it seems that the cooperation concerning the application indicators/status notifiers with KDE—founded in Germany, its foundation still being based there—was very productive. Was this because of the persons involved, or because of the cultures? The communication styles I described above do seem to reflect the stereotypes of the two continents.

What should be said is that the above is a gross generalisation. Generalisations can usually only be used, with great care, when you talk about large groups. In this case it might be better to talk about individuals instead.

I don’t think that what I said above is the whole explanation. I do think, though, that it is something we should keep in mind. It may not only have played a role in worsening the unease and misunderstanding here; it affects all communities that are truly diverse. Traditionally, FOSS seems to have been dominated by people from the US. Now that is changing: more people are learning English—as an example, my father’s generation learned German, not English, as the most important foreign language at Dutch secondary schools, while for me it was English—and ‘developing’ countries are catching up.

As cultural differences become more visible in our communities, we need to be aware of the different ways different cultures communicate if we want to make sure no contributions go to waste.

The uselessness of being the better person Fri, 11 Mar 2011 19:59:17 +0000

If you’re the only true follower of the philosophy you promote, if you are the only person who is true to the spirit of your morally better culture, if others are treating you wrong but you treat them right, then you are the better person.

However, what if that philosophy of yours is hindering you in achieving your goals? What if your morally better culture is closed and looks down upon people who behave differently, scaring them away? What if your right is the wrong in someone else’s eyes, and there is no way to prove that wrong objectively?

Doesn’t your moral perfection, your superiority to all those inferior hypocrites, make it harder to enjoy what you like to do, to reach the people you want to reach, to have success where you want to succeed? Doesn’t that make your moral perfection useless? Doesn’t it mean that those people with their flawed human nature might be achieving more with their sensible, flexible, pragmatic approach?

Canonical and Banshee: making money with others’ open source Fri, 25 Feb 2011 13:30:37 +0000

The recent fuss about the division of revenue from Banshee’s Amazon MP3 store made me think about the moral right of making money with the help of open source code written (partially) by others. In this post I would like to explore this issue, using the example of the Banshee Amazon MP3 plugin and Canonical’s rights to change the affiliate code.

The case

Banshee’s Amazon MP3 store plugin was developed by Banshee star-developer Aaron Bockover, who announced on his blog last August that all revenue of the plugin would go to the GNOME Foundation. The plugin consists of two separate extensions, one for integrating music importing from Amazon’s MP3 store into Banshee, the other for embedding the store’s website. Both are open source, and available from Banshee’s Git repository.

After discussions between Canonical and the Banshee developers, Jono Bacon announced on his blog that the final settlement was that Canonical would receive 75% of the revenue of both music stores, and direct 25% to the GNOME Foundation. Some people were outraged by Canonical taking such a large share of the revenue, arguing that the company was simply profiting from the work of others.

When are you allowed to sell?

I want to investigate this issue by going from the bottom up. Let us first establish why we pay money. We can’t do everything ourselves, because we don’t have infinite time and skills. Therefore we use the services of others, and pay them in exchange for what they produce. That money allows them to buy the products of others, so they can focus fully on their job. Money is thus awarded for a service.

In open source, most of the time you will not have to pay for the software. However, the GPL license does not prohibit selling your software. The Free Software Foundation defines free software not as ‘gratis’ software, but says software is free when a user is free to run the program, change the program, and redistribute the program with or without changes. (Read its piece on selling (free) software if you want to know more.)

When are you entitled to sell?

You pay money in exchange for a service. In the case of the Banshee Amazon MP3 plugin, Amazon gives a share of the revenue to Banshee, as a reward for bringing users to its store. Banshee subsequently chooses to give the revenue to the GNOME Foundation. Note that it is not the end-user who is the customer here, but Amazon!

Under the current plans, the Banshee Amazon MP3 plugin on Ubuntu will give Canonical 75% of the money paid by Amazon and the revenue of the Ubuntu One Music Store. The GNOME Foundation, via Banshee, will get 25% of both. I shall focus on the Amazon MP3 plugin. There are two ways to look at this. The first way is to consider Banshee an involuntary customer of Canonical, buying the service ‘broader access to customers’. The win for them is more income. The second way is to consider Amazon a customer of both Banshee and Canonical, who jointly provide the service Amazon pays for.
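In plain numbers, the announced split works out as follows. This is just an illustrative sketch: the revenue figure is made up, and only the 75:25 ratio comes from the announcement.

```python
# Hypothetical illustration of the announced 75:25 split.
# The total is a made-up figure; only the ratio comes from the announcement.
total_affiliate_revenue = 1000.00  # e.g. one month of Amazon referral fees, in USD

canonical_share = total_affiliate_revenue * 0.75
gnome_share = total_affiliate_revenue * 0.25

print(canonical_share, gnome_share)  # 750.0 250.0
```

Whatever the absolute numbers turn out to be, the ratio fixes the relative reward; the question is whether it reflects the value each party actually adds.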

How does this happen? The Banshee Amazon MP3 plugin, developed by the Banshee project, is the direct means used to make the Amazon MP3 Store available. Other important factors are the attractiveness of Banshee—courtesy of its developers—and distribution via Ubuntu, the most popular Linux distribution on the desktop.

We have seen that both Canonical and the Banshee project deliver a part of the service that Amazon pays for. Canonical is the final distributor, bringing the product to the customers’ doorsteps; Banshee can be compared to a more specialised producer, providing a specific product to the distributor. If we look at the real world, we can see that it is often the distributor at the end of the chain that determines the prices. Farmers, for example, often earn very little for their crops. Most of the revenue on produce goes to the supermarkets that distribute the goods to the customers. Supermarkets may not be the sole method of reaching customers, but they are by far the most important channel; the farmer depends on the supermarkets. This simple fact allows the stores to dictate the prices. It is an economic law: when a good—in this case access to the customer—is scarce, its price goes up. Here that means lower revenues for the farmer.

Canonical can be compared to the ‘Superunie’, the joint procurement organisation of the major Dutch supermarkets. Like supermarkets, it doesn’t actually make everything it offers itself. Instead, it is responsible for the selection, integration and fine-tuning of the components, and maybe for baking the fresh baguettes. Its large market share in the Linux desktop world gives it a lot of power. Some people are opposed to it on principle and say it abuses its power.

Access to many potential customers makes Canonical’s contribution to the ‘service’ provided to Amazon much, much more important. It is very likely that 25% of the Banshee Amazon MP3 plugin’s revenue with the plugin enabled by default will be higher than 100% of its revenue with the plugin disabled by default. The service of enabling the plugin by default is therefore a valuable ‘product’, which is sold to the Banshee project at a not insubstantial price.

This high price can be justified by the fact that Canonical is selling a scarce good to the Banshee project. However, Banshee has little choice but to accept whatever benevolent offer Canonical deigns to make. Because they’ve chosen a free licence, there is no real transaction to be made. If Canonical doesn’t like what Banshee demands, then it can just replace the affiliate code and keep everything for itself. Banshee is powerless. That is the difference with the farmer-supermarket analogy, in which the farmer can decide to reject an offer and not hand over his or her produce.

So, what amount can you ask for this substantial additional value? It is impossible to determine its true economic price when only one side can make demands. The 75:25 ratio is therefore not a representation of the true values of what both sides have to offer, but instead a representation of what the only party with any power over the matter considers the values to be. It is a subjective determination.

Whether or not you agree with the chosen ratio depends on what value you attribute to the services provided by Canonical and by the Banshee project to Amazon. It is not possible to do this fully objectively, and in any case you need extra data to say something definitive.

To me the demands from Canonical don’t seem very unreasonable at all. The value of the huge user share Ubuntu has to offer seems to be worth the 75% slice at first glance. However, we’ll first have to see the statistics from the Amazon MP3 plugin in action on Ubuntu to verify this assumption. If it turns out that Ubuntu brings in a lot of revenue, then the 75% fee is justified. If it turns out that the revenue is relatively low, or average, then Canonical’s share should be lowered to compensate for the proven lower value of the ‘service’ offered by the company. I would propose to do this check not too long after the launch of Ubuntu 11.04, make the results public and swiftly announce change when change is justified.

What do you think? Do you agree with my conclusion? Did you spot any mistake? Please leave a comment!

Banshee 1.9 (future 2.0) with DAAP music sharing Thu, 24 Feb 2011 17:29:04 +0000

It has been more than two years since I published a post about DAAP music sharing with Banshee 1.4. Judging from my site’s statistics, that is still a very popular piece. However, things have changed a bit since November 2008, so I think it is time for an update, even more so now that the awesome Banshee is the default media player in Ubuntu.

You should still keep in mind that Banshee does not have a DAAP server of its own; its DAAP plugin can only consume shared music. One thing that may have been added since 1.4—though I’m not sure about this—is the possibility to connect to remote DAAP servers. Go to ‘Media->Add Remote DAAP Server’, enter the domain name or the IP address of the server, make sure the server port is correct, and press OK.

DAAP servers on your local network should be detected automatically. How do you set up such a server? In my post from 2008 I recommended Tangerine. However, apart from a fix to the menu shortcut in January this year, there has been no active development on it since August 2009. The project seems pretty dead.

However, the only other DAAP server I can find in the Software Centre on the Ubuntu 11.04 alpha release is ‘mt-daapd’. Its description says it is part of the Firefly Media Server, but according to Wikipedia the latest release of that project dates from 2008. The website mentioned in the description no longer exists.

What alternative do we have? It seems we have no choice but to stick with Tangerine. On my system it worked, and I am running the Ubuntu 11.04 alpha, so new software versions don’t seem to have broken things.

Setting Tangerine up

[Screenshot: the ‘Tangerine Media Sharing’ preferences dialogue, showing my Tangerine settings]

First install the ‘tangerine’ package, either by searching for it in the Software Centre, or by looking the package up in Synaptic. Tangerine will automatically start when you log in; all you need to do is configure it properly. Launch ‘Tangerine Media Sharing’—either by searching for it in Unity’s Applications place, or, if you’re using GNOME 2.x, from ‘System->Preferences->Tangerine Media Sharing’—and set it up to take its songs from Banshee. You might want to compare your settings to mine on the right. The Tangerine service starts right away when you enable music sharing in the settings.
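If you prefer the terminal, installing it from the command line is a one-liner. This is a sketch that assumes the package is still simply called ‘tangerine’ in your release’s repositories:

```shell
# Refresh the package lists, then install the Tangerine DAAP server
sudo apt-get update
sudo apt-get install tangerine
```
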

Banshee’s DAAP extension is enabled by default, so the share should show up automatically as soon as Tangerine is started. However, it seems that Banshee currently crashes or freezes when Tangerine is started while Banshee is running. Restart Banshee and everything will be fine. I have been able to share my music without any problems.

(Note: it seems that for now the MPRIS extension is causing problems for Banshee in Ubuntu 11.04 alpha. When playback won’t start, disable that extension, restart Banshee—since it will have frozen—and try again. This will break Sound Menu integration, though.)

‘Ubuntu Linux’, ‘Ubuntu GNU/Linux’? No, use ‘Ubuntu’! Fri, 18 Feb 2011 23:44:34 +0000

Reading the title you might wonder why I would risk yet another flame war between “Linux” and “GNU/Linux” proponents. The reason is not only that I am choosing neither side, but also that a brand name is very important.

When people talk about Ubuntu, they usually have two ways of referring to it. Most frequently people use ‘Ubuntu’, but in some places you’ll find ‘Ubuntu Linux’ used very consistently. This may be a relic of the past; after all, the Ubuntu 4.10 Warty Warthog release announcement has links to rather than. In this blog post I want to argue that we should stop calling Ubuntu ‘Ubuntu Linux’. There are several reasons to do so.

The first, and the most important argument considering we’re trying to market a product here, is that using simply ‘Ubuntu’ makes the brand name a lot more attractive and easier on the mind. Because of the Linux in ‘Ubuntu Linux’, people will associate it with the legacy of past Linux distributions, and I think that ‘Ubuntu’ is a more attractive name on its own.

The second argument is about being sensible about attribution. Several people have said that Ubuntu should very purposely be marketed as ‘Ubuntu Linux’ to give credit to the Linux developers. Other people, who feel that GNU contributed a lot of code to the base of most operating systems using the Linux kernel, even say that we should try to promote Ubuntu using the impossible name ‘Ubuntu GNU/Linux’. However, why would we attribute GNU and Linux, but not GNOME, FreeDesktop, Mozilla, The Document Foundation, Novell, Red Hat or any of the other projects and companies that all contribute to what is ultimately integrated into one Ubuntu? Does GNU really deserve more to be in the name than GNOME? Isn’t the browser the most important tool of the desktop nowadays?

Putting either Linux or GNU and Linux in the name is not fair at all. There is no reason why those vital projects should be attributed, whereas other projects that are just as vital are not.

The third argument is practicality. Your headlines will be a lot shorter when writing ‘Ubuntu’ instead of ‘Ubuntu Linux’.

The fourth argument is conformity. If we want to bring a consistent message, we all should follow the same naming scheme. This is very important. When someone is talking about ‘Ubuntu’, and another person is discussing ‘Ubuntu Linux’, then you create confusion. “Is Ubuntu the same as Ubuntu Linux, or is it something different?” a person might wonder. Ubuntu Linux could very well be a derivative of Ubuntu! We should pull our act together and make sure we deliver a consistent message. Canonical and most people are using ‘Ubuntu’, therefore it makes sense to stick to ‘Ubuntu’.

Why write this blog post to propagate a convention that is already dominant? The reason can be found in the fourth argument: ensuring consistency. Use ‘Ubuntu’, not ‘Ubuntu Linux’! When you see ‘Ubuntu Linux’ being used somewhere in a description, on a wiki page or elsewhere, fix it! Make sure the public knows we’re all talking about the same product!

Retiring from the Ubuntu community Sun, 30 Jan 2011 16:25:46 +0000

Since I first used Ubuntu, in December 2005, a lot of things have changed. Ubuntu has improved tremendously, more than I could have imagined when I booted from the separate Breezy Badger live CD to see what this Ubuntu looked like. In the five years I’ve been a member of the Ubuntu community, much has changed as well. Although many people have come and gone, the size of the community is increasing every day, and the structure has changed over time, the spirit is still the same. It is a welcoming place full of nice people who are enthusiastic about making something great.

The year 2010 has been a tumultuous one for me. On 2 February I finally became an Ubuntu Member, in May and October I attended the two UDSes, and in July I helped organise GUADEC 2010 in The Hague. In October I also became the LoCo Contact of Ubuntu Nederland.

The year 2010 was also the year that my final year at secondary school started. An important transition in anyone’s life. It is very important for me to score high on my exams, but apart from the need for more focus on school I’m also changing as a person.

When I joined the Ubuntu community I was in my first year of secondary school. As someone diagnosed with a mild version of Asperger’s syndrome and ADD in an unfamiliar environment, I didn’t make a lot of contacts at school right away. In a new social environment it takes a while before I learn what to do. The Ubuntu community was an open, welcoming place where I could find company and kind people to talk to, without the fuss that accompanies real-life conversations.

During the years I contributed to Ubuntu I learned a lot and met many kind people. It has been a wonderful experience and a privilege to work together with such great people. However, the last five years have been my teenage years, so it would be strange if they had left me unchanged. As my social skills improved and my school life became busier and busier, I felt less and less need to be present in the Ubuntu community.

My lack of time and my decreasing need for the community are compounded by an increasing lack of motivation. Often I sit behind the computer, feeling bad about myself because I feel I ought to be contributing to Ubuntu, while I’m not. This makes me associate feelings of guilt and dissatisfaction with Ubuntu. I do not want to associate negative feelings with Ubuntu.

I do not want to waste energy and time that I need so much for my final exams. I do not want to turn my great memories of Ubuntu into something bad. I do not want to disappoint people who expect me to do things. Therefore, I have decided to retire from the Ubuntu community and stop before things go wrong. I feel a kind of melancholic sadness while writing this, but, to be honest, also some kind of relief.

I will stop all my work for Ubuntu in the international community. I will not quit as Ubuntu Nederland LoCo Contact. My work for that community will continue as usual for now.

Jorge, the Unity Places API still looks awesome to me, and I will probably be playing with it when it becomes public. Now that I feel free to do just what I want, I may be playing a bit more with code. Writing a Unity Place would be a fun exercise.

To all those countless people I met in the Ubuntu community, I want to say: thank you for being there. Ubuntu was fun because of you. Thank you so much. There are many people who helped me out, but I want to especially thank—in alphabetical order—Carlos de Avillez, Jorge Castro, Jan Claeys, Laura Czajkowski and Dr Vish for their patience with me and for their indestructible enthusiasm, which has been my source of motivation during those years.

I am sorry if I disappoint you by leaving. I do not think that this leave will be forever, maybe I will see you again later, when I come back.

Open source communities: assuming intelligence? Sun, 23 Jan 2011 13:46:42 +0000

Ubuntu is growing, there is no doubt about that. As the number of users grows, so does the number of potential community members. Not all Ubuntu users choose to spend time in the community, but some do. This group is expanding along with the number of users, and we are noticing this in more than just the increasingly busy UDSes: the Ubuntu Beginners Team and Ubuntu Bug Control have both set up mentoring programmes to guide the flow of new contributors.

As we’re welcoming these new people to the Ubuntu community and introducing them to the various tasks there are, we can see a set of problems arise. There is the problem of increasing scale—community manager Jono Bacon has written about handling this on his blog several times—and the issues introduced by the increasing diversity of the community. I’ve written before about the effects of language differences, but there is something else we should realise, which shows itself more and more now that the composition of the community diversifies.

Not everyone is equally intelligent. When considering other people, it is very hard not to assume they think like us, and to judge from that viewpoint. Especially when you are intelligent it may be very hard to realise that there are also a lot of people who have more trouble thinking. Life is not like The Sims, where everyone has an equal number of character points, only distributed across different character traits. With the knowledge of humans we have nowadays we can only conclude that ‘life’ is not fair, because not everyone gets the same number of character points. Some people are smart and beautiful and nice and happy, and other people are stupid, ugly, unpleasant, depressed. Both situations are rare, but both can occur.

This is an important thing to consider in a community that seems to assume intelligence. Every contributor is welcome, which is a good thing, because we don’t want to shut someone out. Every contribution helps us. We can see this in the mentor programmes too. If you create a page at the Ubuntu Wiki and sign the Code of Conduct, you’ll be assigned a mentor. You do not have to sit an exam. At the Ubuntu Developer Summit we are discussing the future of Ubuntu almost as equals, and there are a lot of people that have something interesting to say.

However, what we are doing is not easy work. Building an operating system, writing applications to run on it, running a community, these are all tasks that require skills. Skills that not everyone has. We do, apparently. That makes us more intelligent than the average person, because that is how hard it is to do what we do.

Not all prospective bug triagers, MOTU applicants or passionate artists have got what it takes to build something like Ubuntu. A bad bug triager only causes more work, an unskilled MOTU ruins our credibility, and ugly artwork scares users away. Even when the triager, packager or artist is a very kind person, we should say so.

I do not doubt that the community members are honest enough to tell people when they are not delivering good work. But that isn’t the only side to this. Before you can tell someone that he or she is triaging bugs badly, you’ll first have to have seen badly triaged bugs. That means someone will have to correct those bugs, if the reporter hasn’t already stopped responding. It could also mean that someone has spent time trying to mentor the bad triager. Valuable time of a volunteer has gone to waste, and that is bad for the volunteer’s motivation.

We should not make intelligence a requirement for joining the community, we should not look down on people because they happen to be less intelligent, and we should not become an elitist club. We should, though, make sure that we get the right people in the right places. We should realise that not everyone has something to say that is worth listening to, though you cannot make that judgement until you’ve heard what the person has said. We should prevent frustration for volunteers and prospective volunteers by being clear about what it takes to join the community.

We should not assume intelligence.

Do I want to say something is wrong with the Ubuntu community? No. I do not think there are things we need to change to fix the issue I just described. Yet. We should be watchful, considerate and aware that not everyone is equal. The sentence “Every man was created equal” is wrong, because men were not created, just like women, and they are not equal either. Everyone is unique, in a positive or negative sense.

The Economy of Free changes our roles in the free market Wed, 12 Jan 2011 21:11:06 +0000

Despite its many flaws, of all the simplifying models we have available, the free market has gained the most support and seems to work best, if used with caution. Most economies in the world are run on the assumption that the free market theory is generally true.

Generalising, from a consumer perspective the theory works like this: there are two parties, consumers with money, and vendors with products. Both parties want to have much of what the other party has. The vendor tries to tempt the consumer to buy the product, by making the product more attractive to the consumer. This could mean that the product becomes technologically better, or that it becomes superficially better, which still makes the consumer happier. Because of competition for the attention of the consumer between the vendors, they have an extra motivation to keep improving.

The vendors are trying to please the consumers as much as possible, because the consumers are the people who decide where the money will go. With every purchase, you vote for a specific product. You reward or punish decisions made by the vendors. If a certain decision is bad for the consumer, the vendor will sell less of the product. This is a motivation to do the consumer no harm.

The rise of the internet has made it possible to transmit information for a price that approximates zero in the case of high volumes. It is very easy to reach a very large audience for a very low price. This has made it alluring for vendors to start off giving their product away for free, out of idealism, or because the vendor just likes making the product. The positive reaction of consumers to this has led to the rise of the economy of free. More and more services are available for free.

However, even hosting a very small website is not free. You have to pay for it to stay up. It becomes more expensive when your audience grows. Further costs are added when you want to deliver high quality content, or content that is in popular demand. YouTube is free for us to use, but Google does pay license fees for the music hosted on the video site.

This money has to come from somewhere. In the economy of free it is not the user who pays for the product. Usually the advertisers fund the vendor, but often also investors. YouTube and Google earn a lot of money with advertisements, and so does Facebook. Another example is Ubuntu, also given away for free to regular users. Canonical earns money from delivering various services to companies, and gets it from its investor and founder.

In our little free market model the properties and the quality of the product were determined by the consumer. The consumer pays, so the consumer chooses which products stay and which go. However, in the economy of free we have a different payer: the companies and persons buying ad space and investing money.

We are just as much the customers of Facebook as cows are the customers of dairy farmers. A farmer has an interest in the happiness of his cows, but in the end they are not the ones he has to please. The people who buy the produce of the cows, the milk, are the real customers. At Facebook we are the milk cows. Facebook’s product is its user population and the information they put online. You might say that we pay with our personal information, but personal information alone is not enough to pay rent and wages; you need cash for that.

Ubuntu’s product is the happiness it provides its founder, and the services provided by Canonical to companies. Its regular users are by no means comparable to milk cows, but they are more a tool than the actual consumers. I don’t mean ‘tool’ in a negative way; I mean that the regular, non-paying Ubuntu users are needed for the cultural and emotional aspects of Ubuntu, for its important community. But in the end, this too does not pay the bills.

So, in the economy of free, the people we think of as the consumers are no longer the consumers. This changes their influence on the products they use. No longer are they voting directly for a certain product. You can go to another Facebook, but if all advertisers have a certain basic requirement of their product, and all social networks need advertisements to break even, then you will never be in control like you are in the supermarket.

Our role is different in the economy of free: we are the product. Especially when social pressure is high, we lose part of the say this simplifying model attributes to us, and ultimately we lose it to the vendors.

Disclaimer: this is not an attack on either Facebook or Canonical, it is just an observation I would like to share with you.

Changes to the One Hundred Paper Cuts project for the Natty cycle Fri, 19 Nov 2010 12:43:12 +0000

As you may have noticed from Ivanka Majic’s article on the Canonical Design blog asking for contributors to the One Hundred Paper Cuts project, we are running it again this cycle. We will try to fix as many nuisances in the Ubuntu desktop as possible.

During the Ubuntu Developer Summit in Orlando, Florida, we discussed how we wanted to continue the project this cycle. This session resulted in some changes to the requirements of what constitutes a valid paper cut.

We now accept all trivially fixable usability bugs that the average user would encounter in a default application of Ubuntu or Kubuntu OR in any of the featured applications. This expands the number of possible targets and makes sure that we can also look after parts of the desktop that are not included by default, but which will get a lot of exposure.

We also defined our focus as being on applications and upstreams. What this means is that we want the project to be especially about making the applications we ship work great, and that we want to forward our solutions upstream and cooperate with the upstream projects whenever possible. Projects that would like to work together should contact Jorge Castro. Do note that the One Hundred Paper Cuts project is only for applications that are provided in Ubuntu by default, or are recommended in the Software Centre.

More details and guidelines can be found at the One Hundred Paper Cuts project’s wiki page. If, after reading this page, you still have some questions about the project, please don’t hesitate to ask on IRC—in the #ayatana channel—or in a reaction to this blog post.

My goals for the Natty cycle Mon, 15 Nov 2010 06:00:07 +0000

It is now two weeks after UDS in Orlando and the test week at school ended yesterday, so I now have time again to plan for the Natty cycle. In previous cycles I’ve been very unfocused, doing a lot of small, different things. This cycle I want to focus, and to help me with that I would like to blog about the goals I set for myself.

This cycle I will probably be giving most of my attention to my role of LoCo Contact for Ubuntu NL, which I assumed Saturday 30 October 2010, while waiting for my plane in the Caribe Royale lobby. I aim to strengthen the structure of the LoCo, making sure we finally get a clear organisation and can act decisively.

That is my most important goal, but there are more. I want to work together with Vish to make sure the One Hundred Paper Cuts project will be a success for Natty, just like it has been for the past few cycles.

The Bug Squad Mentorship programme will see some experimenting to test ways to improve this vital project. It is my goal to help make the programme ready to start delivering good new triagers to the Ubuntu community.

My last goal is to contribute more to the Ayatana project, especially to Unity and to help Unity Places gain traction. I’m not exactly sure how I will contribute to Unity, but it is a great, fun way to learn something new.

These are my main goals and this is what I will be focussing on. I will probably still do some work in the areas I contributed to before, but not as much as I used to. Since this is my last year in secondary school, I have less time to spend, and I don't want to spread my contributions as thin as before.

In half a year I will review my goals and see if I succeeded!

Language barrier is an innovation barrier Sat, 13 Nov 2010 15:54:33 +0000 This is a thought I just came up with. I haven't given it much thought, but I would like to share it with you anyway, just to make you think again.

Website popularity usually goes one way: from the English world to the non-English world. Non-English websites can become very popular within their own cultural boundaries, but rarely get much traction outside them. English, and especially American, websites start in English-only mode. Then they become popular in the reasonably large English-speaking community. Next they acquire something magical or special for the non-English part of the world, partially because they are promoted in English popular media, which the rest of the world follows very closely as well.

This gives them a huge edge. Example: a very popular Dutch social media network is Hyves. It is very popular among secondary school students, because it provides an easy interface to do what they want to do: share pictures, share status updates and leave messages on each other's walls. It is not run by a creepy guy who sells your personal details to the advertisers, although it was recently bought by the Dutch 'Sun'.

The English Wikipedia on Hyves: “In May of 2010 Hyves had more than 10.3 million accounts (corresponding to two thirds of the size of the entire Dutch population which stands at over 16 million in 2010), with growth of over two million members compared to the previous 1.5 years.” It is the most popular social network in the Netherlands and it obviously does something right. It also offers an English version of its website, but that never became very popular.

Let's look at Facebook. Founded in 2004, but opened to the wider public only in 2006, a year when the Dutch police had already started to use Hyves to locate suspects. It would still be a while before Facebook would get so popular that stories about suspects sharing their location on Facebook would emerge, but Hyves already had them. Facebook grew to half a billion users very quickly, many more than Hyves' 10.3 million people. However, more and more Dutch people are now moving to Facebook, and I do hear that many people find its interface confusing and don't know where to find things, considering it a more limited platform. (Although FarmVille makes up for that for some people.)

Why did Hyves, which still seems to be the favourite amongst secondary school students, and also, though less so, amongst adults, never gain enough traction in the English world to become popular? You, reader, have probably never heard of this network before. I believe that this is caused by the language barrier, by the fact that the international culture, the one world that globalisation is said to have brought us, is mostly a one-way culture.

We consume American culture, and our own culture is heavily influenced by it, but because no American really follows Dutch culture, Hyves could never have gained enough fame abroad. Many Dutch men and women gladly joined Facebook, despite its lack of localisation, but joining Hyves would simply not be possible for anyone outside the 27 million people who speak Dutch. I read English-language news blogs about English-language social networks, but I do not read Swedish news blogs about Swedish social networks either.

This does not apply to just one social network; you can see it in many places. Companies like TomTom are the exception, not the rule. It takes a lot of effort and translation to market a non-English project or product in the English-speaking world, but it takes no effort at all to market an English project in the non-English world. Why? Because apparently we want to be like you, and you too.

EDIT Referring back to the title of this post: because of this barrier, innovation also flows one way. It can flow from the non-English world to the English world, but far less easily than the other way around. This causes innovative initiatives outside the English culture to stagnate and limits the pool of potential the world can tap from.

Ubuntu Developer Summit Natty, Thursday and Friday Mon, 01 Nov 2010 21:56:54 +0000 The Ubuntu Developer Summit is over! I'm sitting at home now, and after a day of recovery tomorrow the regular life of school and homework will start again. And how it will start! Next week is a test week full of school exams, so I have quite something to prepare. But before that all starts I still have something to do: report the last two days of the UDS to you. Apologies for the delay.


Thursday started with a community roundtable. The Beginners Team was one of the things we discussed, and I have to say that I was amazed by the large amount of good work they have done, without many of us taking notice. They really help newcomers to the community to find their place.

During a session on the Unity Places API we learned how many possibilities this interesting software offers. The most exciting piece of technology was the great 'libdee', which allows you to share a table of information over DBus and use it in several applications at once. The table automatically stays synced across all applications! Really cool stuff.
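To illustrate the idea behind libdee, here is a toy, in-process sketch in plain Python. This is not the real Dee API (which synchronises the table across processes over DBus); the class and method names are made up purely to show the concept of several consumers attached to one automatically synced table.

```python
# Toy illustration of the libdee concept: several "views" attach to one
# table, and every change is pushed to all of them automatically. The
# real libdee does this across processes over DBus; this in-process
# sketch only mimics the synchronisation idea.

class SharedModel:
    """A tiny table whose attached views always see the same rows."""

    def __init__(self, *columns):
        self.columns = columns
        self.rows = []
        self.views = []

    def attach(self, view):
        # A late joiner is first brought up to date with existing rows.
        self.views.append(view)
        for row in self.rows:
            view.on_row_added(row)

    def append(self, *values):
        row = dict(zip(self.columns, values))
        self.rows.append(row)
        # Push the change to every attached view, keeping them in sync.
        for view in self.views:
            view.on_row_added(row)


class View:
    """A consumer that records every row it is told about."""

    def __init__(self):
        self.seen = []

    def on_row_added(self, row):
        self.seen.append(row)


model = SharedModel("uri", "title")
a, b = View(), View()
model.attach(a)
model.append("http://example.org", "Example")
model.attach(b)          # b joins late but still receives existing rows
model.append("http://ubuntu.com", "Ubuntu")
assert a.seen == b.seen  # both views see the identical table
```

In the real library the model lives behind a DBus name, so any application on the session bus can attach and stay synced without the applications knowing about each other.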

During the ‘Indicator Framework Changes for N’ I learned that the keyboard shortcuts nightmares will go away in Natty, because of the move to Compiz. We can manage the shortcuts there, so things should be working much more smoothly.


Caribe Royale, as seen from the smokers' spot in front of the UDS convention centre.

Thursday had two great plenary talks, from Allison Randal and Ivanka Majic, respectively Technical Architect of Ubuntu and Canonical Design Lead. Allison gave a general talk about how we are different and reminded us of our ways. She reminded us to make sure we can look at things from a different perspective. Ivanka told us about making opinionated decisions, i.e. making choices for our users. Our users don't want to configure everything themselves, and sometimes when they do want to, we shouldn't let them. If you limit yourself to what your users need and make sure the users are not confronted with difficulties they cannot solve, your product will become much better and a lot more user-friendly.

The two-hour Bug Squad Roadmap session was a bit of a marathon, but it was a great session. We are planning some changes to the Bug Squad Mentorship programme: we will run a trial of putting a few mentors and students into one group and see how that goes. Also, I mentioned that bug triaging is still too much of a first step, a first hurdle, on your way to MOTU stardom. Now, I don't want to say that MOTUs aren't stars, but I do think that bug triaging also needs some status of its own. Why? Because we need more and better people. We could make the requirements for becoming a bug triager stricter (Pedro is looking into bringing some clarity to the application requirements), but we also need to make sure that in our communication we show that bug triaging is something in its own right, worthy of mentioning on your resume. Bug triaging is not easy; it requires a lot of knowledge and effort to become a good triager.


Friday we discussed how to handle non-English bug reports. We didn't make any fixed decisions this session, but we will take a look at using Launchpad Answers to solve our problem. Bug reporters would be encouraged to use Launchpad Answers to report their bug in their native language; their local community could then help with translating the issue into a bug report and translating the communication.

We had a very interesting session on the results of the Unity Usability tests. It turned out that many users had problems with finding out how to drag icons. They tried to drag them up or down in a straight line, but that was not possible. Also, it was highlighted how extremely difficult users find it to launch an application they just installed from the Software Centre. They first look for more information at the “More Information” page of an application, then try the website—which is often not very useful to a user—and then the screenshot. Charlene said that some users even thought the screenshot was the real application!


My bed in Caribe Royale, complete with its five pillows

The lightning talks were really good this time. If you want to use threading and pipelines in C, use libpipeline! If you have a question about Ubuntu, go to AskUbuntu! If you want to design an awesome interface in Qt, use QML!

There were only two sessions left then: a summarising QA Roundtable and an interesting session in which we examined the needs of the Ubuntu community with regard to AskUbuntu—I hope to see localisation coming to AskUbuntu in this cycle. After those two sessions we had the closing procedures and then it was over! Done, finito, voorbij!

I took flight MP636 from MCO to AMS, which left on Saturday 30 October at 19:30 EDT and arrived a bit before 9 o'clock CET on Sunday 31 October. Today I didn't go to school, but instead slept until 12 o'clock. I'm curious to find out tonight how much of my jet lag is gone already.


Confusion about the Ubuntu NL LoCo contact Fri, 29 Oct 2010 22:42:47 +0000 The failure of Ubuntu NL's first re-approval, and the fact that the community only realised afterwards that a re-approval had taken place at all, have caused some unrest in the Ubuntu NL community.

At the moment the problem for Ubuntu NL has become even bigger, because the LoCo Council is simply unable to reach our LoCo contact, Dennis Kaarsemaker. We ourselves also only had very brief contact with Dennis this week, for the first time in months, when a team member quickly phoned him.

In this post I want to take away the confusion that currently exists by explaining what happened, telling you what has been done, and saying what we plan to do.

Why is there a re-approval at all? A while ago a policy was introduced to re-approve every local community every two years. This is to ensure that the quality is still high enough and that the team still deserves its official status. Dennis started Ubuntu NL so early in Ubuntu's history that it had never been approved before.

The re-approval on 20 July (log files) was thus the first approval Ubuntu NL ever underwent, and it failed, not because the LoCo Council thought Ubuntu NL was very bad, but because there was not enough documentation. For a re-approval a LoCo team has to create a wiki page with information about what has been done since the previous approval. Our page was poor: it contained far too little, and it did not provide all the information that was asked for. That was not so strange, because the page was only created shortly before the meeting. For comparison, have a look at the page of Nicaragua.

At the meeting of 20 July it was not decided to take away Ubuntu NL's official status, but to give us another chance. A follow-up appointment would be made for August or September. That became September and October, and now it is already 29 October. That is not because the LoCo Council is not doing its job, or not trying to arrange an appointment; it is because the LoCo Council simply cannot get in touch with our LoCo contact, Dennis. A LoCo contact is responsible for the contact between a LoCo team, the LoCo Council and the international community.

First I want to make very clear that I think we should be grateful to Dennis for his extensive work for Ubuntu and Ubuntu NL. He founded and built up the local community. That is not easy work, and it can sometimes be very frustrating to build a group from the ground up. He has also contributed to the infrastructure of not only Ubuntu NL, but Ubuntu itself. However, as a team leader you also have the responsibility to step down when you no longer have the time. Given Dennis's total absence over the past months, if not years, something went wrong here. That should have been handled differently.

Dennis has said several times, including on the wiki page written for the re-approval, that he is looking for a replacement for his role as LoCo contact. However, he never got around to choosing a replacement, or even to telling the community directly that he was looking for someone.

Meanwhile, a re-approval still has to be arranged. I am now in Orlando, Florida, where the Ubuntu Developer Summit has just ended, and among other things I briefly discussed our situation here with Laura Czajkowski, a member of the LoCo Council. She let me know that the LoCo Council's patience is not endless. There was still no word from Dennis, and if word does not come soon, we will leave the LoCo Council no choice but to reject Ubuntu NL.

When I heard that I quickly reported it on the forum and we started trying to get in touch with Dennis. In the end Ronnie briefly phoned Dennis on Thursday. In that call it was agreed that Dennis would first phone Sebastian and then email the LoCo Council and me to transfer the position of LoCo contact to me, in line with what was decided at the meeting of 10 October.

Once that is arranged, I will try to make an appointment for the re-approval as soon as possible. I will try to do this in consultation with the community as much as possible, so that people who want to attend can do so. With our nice re-approval page I think we can certainly pass.

So it has all taken far too long. To prevent it from taking even longer, I have set a number of deadlines in this blog post: if I still have not heard anything from Dennis by the end of tomorrow afternoon (Dutch time), I will send him one more email. If by the end of Sunday I still have not heard anything, I will contact Sebastian, the other admin of the Launchpad team, and ask him to arrange it. (If that fails too, there is always the option of asking the Launchpad administrators to change it in the database.)

There is no reason for panic or unrest. What happened is unfortunate, and it is unfortunate that everything took so long, but there is light at the end of the tunnel. Soon we can start afresh, and I hope we will hear a lot more positive news from Ubuntu NL. Shoulders to the wheel, people: we have already proven we can build something beautiful, so let's keep it up!

Ubuntu Developer Summit Natty, Tuesday and Wednesday Thu, 28 Oct 2010 04:15:51 +0000 Because I was very busy with other stuff—writing down the proceedings of two sessions, and some boring social stuff—I didn’t have time to write the blog entry for Tuesday yesterday. So I’m covering both Tuesday and Wednesday in one post.


The first session I attended Tuesday was the Community Roundtable, in which we discussed Team Reports and Planet Ubuntu, amongst some other things. It was said that for some it is not very clear how to add the Team Reports to the wiki in the correct fashion. To solve this the documentation will be improved, and we may also see a dedicated website for submitting Team Reports. I remarked that another language is another barrier; in Ubuntu NL the most important argument against writing Team Reports was that we thought we had to write them in English. However, I have been told that it is not required to publish the Team Reports in English. This will make it a lot easier for us to write down our activities.

We also talked about Planet Ubuntu. A number of people, including me, expressed the wish that all posts on Planet Ubuntu get a name attached. We will look into how to make the names of the authors show up next to aggregated posts from the Canonical Design Team as well. It is also very possible that we will move away from PlanetPlanet, since that is an orphaned project, and even the original author has told people to move to something else.

The Indicator Datetime and Indicator Session sessions were very short; it is mostly bugs that need to be fixed and polishing that needs to be done. Also, Ted Gould likes Texas so much because there are no naturally occurring lakes, save for the one that was created when a beaver built a dam somewhere.

During the plenaries I spent some time trying to figure out what is going on with this Packard Bell EasyNote MZ35 laptop. I've got this annoying bug where every time I am using wireless on this crappy RaLink RT61 (the 'rt61pci' kernel module has been causing a lot of problems on this laptop and is probably to blame for this as well) and am not connected to a power socket, the laptop freezes.

In the ‘Encouraging game development on Ubuntu’ session Rick Spencer demonstrated his awesome work on creating a PyGame template for Quickly. Currently the command ‘quickly create ubuntu-pygame foo’ creates an actual working basic game, a nice start if you want to create a simple game. I didn’t follow the rest of this session a lot, so I can’t say much more.

Next was a really great session about the One Hundred Paper Cuts project. We should be really thankful to the wonderful, wonderful work Vish has done for this project last cycle. Without his tremendous efforts we wouldn’t have got nearly as many paper cuts solved as we did for Maverick. For the Natty cycle we want to make sure he gets some help. At least I plan on doing more work here. We are making some changes to the way we work on improving the package descriptions and are planning some other improvements to the project as well. More of that later, as I think this deserves a blog post of its own. One thing we will change I can say is that we will allow paper cuts to be reported against more applications, even some that are not in the default installation.

The last session of Tuesday was about the Adopt-an-Upstream project. This pet project of Jorge's will become more visible as we plan to make more people explicit Upstream Contacts who are currently already doing similar tasks. More will probably be announced in a blog post that Jorge will most likely write. (Right, Jorge?)


The first session of the day I attended was more of an informative session explaining the different technologies on which Unity is built. Something new I learned is that Unity will be a plugin of Compiz, just like Scale or Expo.

The ‘Launchpad for Upstreams’ session was a really useful one, mostly because it allowed different ‘stakeholders’ to share their thoughts on Launchpad’s behaviour regarding upstreams and propose some optimisations. An exciting feature that was announced during this session was something that the developers have been working on for the last few months: we will soon see options on Launchpad to set the mail notification level in bugs, and the possibility to subscribe to bug searches. Great news!

Here I would have talked about the plenaries of today, and the other sessions of Wednesday. However, because I am very tired, it is very late already and there weren't many very important sessions for me, I am skipping this. Apologies in case you were interested.

The last session of the day was about the Bug Squad's documentation. Currently there is way too much text and there are way too many pages. This needs to change. It will be a lot of work, and this session was only a first discussion. We will probably see all current documentation moved to some archive, and then we will start the documentation from the ground up, moving over the useful pieces of text and designing an understandable and maintainable documentation structure. Stay tuned for more.

See you tomorrow.

Ubuntu Developer Summit Natty, Monday Tue, 26 Oct 2010 02:05:10 +0000 I’m writing this blog post in a chair in the ‘Grand Caribe Convention Center’, at the end of the first day of the Ubuntu Developer Summit in preparation of the 11.04 Natty Narwhal release. It’s been a very interesting first day to say the least.

The 'Afsluitdijk' as seen from the passenger seat

The TomTom is for the time estimate, I swear!

I arrived in Orlando on Saturday evening, after a flight of nine hours and 40 minutes from Amsterdam, which was preceded by a two-hour car drive to Schiphol. My flight was delayed by one hour, but I didn't have the bad luck of the poor people arriving from Poland, who had to circle in the air above Frankfurt for two hours because of heavy mist.

The road to the factory outlet shops nearby, next to a motorway.

On our way to cheap shopping!

Sunday morning we went with a couple of people to a factory outlet store group nearby. It was within walking distance, but we did have to risk our lives by crossing the busy motorway next to the hotel, and the 30° C temperature made things a bit more sweaty than they would have been in the 8° C there is at home right now. In the evening we took the Mears buses to ‘Downtown Disney’ for dinner. There were three buses, three drivers and three guides for the four of us who showed up. Most people were at the mandatory Canonical-only keynote, or had yet to arrive. If you’re at the Ubuntu Developer Summit, please be aware of the buses Canonical arranged to bring you to either ‘Downtown Disney’, ‘Universal City Walk’ or ‘Pointe Orlando’ in the evenings!

This morning opened with the kick-off keynote, started by Jono Bacon and finished by Mark Shuttleworth. Jono explained how the UDS works, Mark introduced the plans for Natty Narwhal. I could talk about the funny routine with the fly in the ice cream (moral: one bug can spoil it all) or the ‘cadence, design, quality’ focus line, but of course there is only one thing that everybody is talking about. That is this: for 11.04 Unity will replace GNOME Panel as Ubuntu’s shell for GNOME on the desktop as well as on notebooks.

I’m sure many people will be shocked by this, but it comes as no surprise to me. As Ars Technica pointed out in their article covering this morning’s announcement, Red Hat (which appears to be exercising complete control over GNOME Shell development) and Novell are mostly focussed on the enterprise desktop, which is completely different from the end-user’s desktop Ubuntu is focussing on. Combine that with the fact that it has been frustratingly hard for Ubuntu community members and Canonical to contribute to GNOME, and it is no surprise that some divergence eventually occurs.

Exciting times are ahead, and it will not be easy to make Unity as good as it needs to be, but I have full confidence that we will have delivered a great product when the next LTS release is there!

The sessions I attended today were the Desktop Team Roundtable, the Ayatana Roundtable, App Review Board Review, Ubuntu Development Advocacy and Quickly Widgets.

The Desktop Roundtable made it clear that this will be a very interesting UDS for the Desktop Team. The fallback for when Unity is not supported by the hardware needs to be planned, and GObject Introspection and the default application selection will be discussed as well.

On the Ayatana front we will see a merge of the GtkMenu processing code of AppMenu and Application Indicators. This should help a lot to solve bugs that are currently plaguing exotic menu items and mundane submenus in Application Indicators.

During the Quickly Widgets session Rick Spencer gave a nice overview of the great number of widgets that are already available from this neat little project he has been running in his spare time. We probably won’t see Quickly Widgets moving to a full-blown library in the near future, but there will be improvements and exciting new widgets. Rick has been working on a webcam widget, and we briefly discussed video/audio-player widgets as well.

Today’s plenaries were interesting as well. The nicest was Rick Spencer’s plenary on ‘gifting’, in which he reflected on contributing, giving something to someone else, without gaining direct benefit from it yourself. We should keep in mind that everything revolves around the user: we give to the regular user, not so much to other developers.

The most interesting two plenary sessions were the last two by Matthew Paul Thomas and Evan Dandrea, on the applications available for Ubuntu. Statistics were shown of the number of applications available for Ubuntu, which had grown to about 3000 (*.desktop files) in six years. To show the contrast with other applications, statistics of Android and iOS were shown. Android now has 100,000 applications available in its App Store, iOS 300,000. Both operating systems are younger than Ubuntu’s six years. Clearly, something’s missing.

There is a bottleneck: Debian and we, the distribution, package most, if not all, of the software available on Ubuntu. It takes a long time for all applications to get updated, some get neglected, a lot of software is never packaged, and developers cannot present software to the users as they want it; it has to go via a proxy: the distribution. This does not scale, hence it would be much better (in my personal opinion, at least) to make the developers themselves responsible for packaging and delivering the application to the users. The figure Evan Dandrea showed in his presentation was a circle, instead of DEVELOPER->UBUNTU->USER, with Ubuntu being an upstream for the developers, providing the platform and feedback from the users. The users give Ubuntu the feedback, and Ubuntu provides the channel (Software Centre) for the software to arrive at the user’s doorstep, but the developers themselves would do the packaging.

The hall of the conference centre at the end of the day

Of course this would require a lot of automation, but it is something that the App Review Board is very busy with at the moment. We’ve got Mago to do automated testing and there is work going on to write a Python framework for automated testing. Ubuntu is heading in the right direction!

See you all tomorrow!

Summer 2010: away on a holiday Thu, 05 Aug 2010 17:32:10 +0000 Tomorrow morning at 6.00 CEST we’ll be leaving for Dunkerque, where we will embark for England. We’ll be staying for two weeks in Tiptree, located in the borough of Colchester.

In my luggage I didn’t make room for any laptop or netbook, so for the duration of the two weeks I will be offline. This will severely limit my capabilities to respond to email, comments and IRC conversations.

The UDStream-based WordPress plugin I’m developing, Sociaal, will not be updated during those two weeks, even though it doesn’t yet do all the tricks it says it has learned. Updates to fix this will arrive after I return.

My partial fix for the Application Indicators bug I was working on will probably be finished by someone else.

If there is anything else: feel free to mail me, but from Friday 6 August up to and including Friday 20 August I will be unavailable.

Just 1%? That’s a challenge! Mon, 02 Aug 2010 17:24:26 +0000 Assuming everyone reading this blog post heard about the flame wars that raged through the communities after Dave Neary’s talk revealed Canonical is only contributing 1% of the commits to the GNOME project, I’m not going to fuel them by linking to it.

However, I would like to take the opportunity to try to turn this into something positive. Because, 1% guys, we know we can do better! Don’t see this debate about Canonical’s contributions to GNOME as an attack on the Ubuntu project, but instead as a challenge!

Raise that number! Show your appreciation for GNOME by contributing to it. We all know the One Hundred Paper Cuts project; work on solutions for the bugs reported there and forward the patches upstream! Go to GNOME Live! and browse the project pages for ways to contribute. For I believe that the best way to deal with criticism is not to attack the critic by starting a flame war, but to deal with the problem being pointed at.

The challenge: just 1% of the commits are from Canonical. The solution: contribute!

It seems from Dave Neary’s slides that addresses are also attributed to Canonical (correct me if I’m wrong!), so it is not limited to Canonical employees.
