ODF Interoperability: Brussels ODF Plugfest, 19-20 April 2012
The eighth ODF Plugfest will take place on April 19th and 20th 2012 in Brussels (Belgium). (More …)
Two days ago we announced that in two weeks the open source and open standards world will converge on Orvieto (Italy). The second ODF Plugfest (2-3 Nov) will open the week, bringing together implementors and stakeholders from all over the world to work together on ODF interoperability.
The OpenOffice.org 2009 conference (4-6 Nov) – organized by the PLIO association and the Orvieto LUG – has a very rich program, including sessions aimed specifically at schools and at OOo development.
I’m proud to announce that the ODF Plugfest is under the sponsorship of the “Consiglio Regionale dell’Umbria” and the “Regione dell’Umbria”.
Osvaldo Gervasi
President of Umbria’s Open Source Conference Center
Osvaldo, that’s very good news indeed!
The Umbria region is really at the forefront when it comes to open source and open standards. Let me know if you think anyone from the Region or the Regional Council would be willing to give a speech at the ODF Plugfest on day one (2 Nov) to tell the audience about Umbria’s approach to open standards and ODF.
Thank you for your help!
The Dutch government is showing the way to go: the Minister for Foreign Trade, Frank Heemskerk, opened the now famous ODF Plugfest saying that a joint course of action for developing effective ODF support in each other’s products is needed.
Last week the Italian government announced a joint effort with Sun Microsystems to foster the use of StarOffice by local public administrations, but the press release mentions neither ODF nor open standards. Improvements are needed, starting with the lack of a coherent European strategic vision on standards compliance.
Rajiv Shah and Jay Kesan wrote the paper “Running Code as Part of an Open Standards Policy”, arguing that the “running code” requirement – i.e. multiple independent, interoperable implementations of an open standard – should be part of governments’ open standards policies.
Last week the Dutch government hosted the first ODF plugfest: creators, implementors and end-users met up to improve OpenDocument interoperability for real, and it worked out well.
Hi Roberto,
about OOoCon, good idea! We are going to arrange a Document Interoperability Day, to which we will invite all stakeholders to discuss the matter. Your proposal fits perfectly! Would you like to contribute to the organization?
Yes Davide, I am happy to take on the challenge and make it a pragmatic event, aimed at real interoperability. I’ll come up with a proposal soon, as promised.
The Italian OpenOffice.org Association (PLIO) welcomes the fact that Microsoft Office 2007 now natively supports the Open Document Format, the file format for electronic office documents originally developed by Sun within OpenOffice.org and now an ISO standard.
One year ago the association welcomed Microsoft’s decision to support ODF.
Something that should get more attention is that OpenFormula, which defines spreadsheet formulas in ODF, is still incomplete after three and a half years. There has not even been a status update since late 2006.
One year ago the Free Software Foundation Europe introduced the Document Freedom Day, and this year too a global day for document liberation is scheduled, for 25 March.
The initiative promotes only ODF; other open standards, such as the Portable Document Format (PDF), are ignored.
Today the OMAT conference hosted the “Standards and Interoperability” session, and for the second time this year Adobe, Microsoft and the Italian OpenOffice.org community discussed interoperability and conformance.
The first OMAT conference was held just as Microsoft was enjoying OpenXML approval, while today they publicly announced their decision to join the OIC TC, the OASIS ODF Interoperability and Conformance technical committee.
The Italian OpenOffice.org community has already welcomed Microsoft’s decision to support ODF. Now that they have decided to join the OIC TC, accepting Rob Weir’s invitation, we look forward to seeing IBM and Microsoft cooperate on real interoperability.
Sun, along with IBM, announced the availability of the ODF Validator – a tool that validates files against ODF ISO/IEC 26300, ODF v1.1 and ODF v1.2 – as part of a broader initiative that goes under the name of the ODF Toolkit Union.
The ODF Toolkit Union is a new open-source software community project aimed at making document software more innovative, versatile and useful for business. At the present stage, two SDKs, two tools for processing ODF documents and a conformance tool are available for download.
Declarations of conformity to open standards are a self-certification process, and tools like the ODF Validator (available also in source code form) can at least help users and consumers make better informed choices. Public administrations choosing products that implement different “flavours” of a standard can dramatically affect interoperability. That’s why public administrations should ask Standard Setting Organizations to make conformance testing part of the standardization process.
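The ODF Validator performs full schema validation, but the very first layer of any conformance check is structural: the ODF specification defines a document as a ZIP package containing a “mimetype” entry that declares the document type, alongside the XML streams such as content.xml. As a minimal sketch of that first layer – the function name and error messages are mine, not part of any toolkit – a check might look like this:

```python
import zipfile

def basic_odf_check(path_or_file):
    """Return the declared MIME type if the file looks like an ODF package.

    This only checks the package structure (ZIP container, "mimetype"
    entry, content.xml stream); real conformance testing, as done by the
    ODF Validator, also validates the XML against the ODF schemas.
    """
    with zipfile.ZipFile(path_or_file) as zf:
        names = zf.namelist()
        if "mimetype" not in names or "content.xml" not in names:
            raise ValueError("not an ODF package: missing mimetype or content.xml")
        # The mimetype entry holds a plain ASCII string such as
        # "application/vnd.oasis.opendocument.text" for a text document.
        return zf.read("mimetype").decode("ascii")
```

A self-certified product could easily pass a check like this while still producing an incompatible “flavour” of the format, which is exactly why independent conformance testing matters.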
Getting back to the ODF Toolkit Union: as a matter of fact, two members don’t make a union. Indeed, not even the IBM press room mentions the news, even though Michael Karasick, Director of IBM Lotus China Development Labs, made the announcement at the OOo Conf 2008 in Beijing, also spreading the word about how good Symphony is.
Will the club welcome other vendors?
The Open Source World Conference is now taking place in Malaga – an international gathering that, after five editions, is beginning to be considered a must within the sector.
This afternoon, within the open formats session, I was supposed to give a speech on standards compliance and the role of European public administrations, based on a paper accepted by the OSWC Technical Committee. Unfortunately I had to cancel my trip at short notice, so I wish to share some excerpts from the paper.
The meaning of understanding and the importance of open formats. The relationship between information and its representation is not well understood: people often think that digital data and the information it represents are (almost) one and the same thing. There is no full awareness that, just as one must first decode a text written in an ancient language in order to understand it, one must first decode digitized information in order to use it.
We get some help in fully understanding the issue from Hofstadter, who suggests distinguishing three different levels in a “message”, according to the following definitions: the “framework message” is the message itself, implicitly transmitted by the overall exterior aspect of the “information carrier”; understanding it means understanding the need for a decoding mechanism. The “external message” is the information implicitly carried by the configuration of symbols and structures of the message, telling us how to decode the internal message; understanding it means being able to build a proper decoding mechanism for the internal message. The “internal message” is the familiar one, the message that the sender wanted to send us.
In our context the framework message may be exemplified by a CD carrying a file named “Beethoven Ninth.mp3”. The external message in this case might consist of the information that can be obtained, through reverse-engineering techniques, from the study of other documents expressed in that same digital format. The internal message might consist of the sensations induced by playing back the music.

The “mechanism” allowing the internal message to be extracted – that is, the “information detector” – may be a computer running a given piece of software, or any other apparatus (such as a music player); but what one really needs is that the implicit information contained in the external message be made explicit by proper and detailed specifications of the format, expressed in natural language and/or by generative grammars or equivalent formalisms. Once the availability of the information needed to build an information detector has been granted, the internal message can be freely received and understood (in this context it would not be appropriate to attempt further clarification of such a naturally subjective and individual notion as “understanding”).
What is indeed important for the user is the possibility to detect and extract the information contained in digital data expressed in a given format. To ensure this possibility, we need open standards.
Today, declarations of conformity to a file format standard are a self-certification process. In Europe, the CE marking is a mandatory conformity mark on many product groups, indicating conformity with the essential health and safety requirements set out in European Directives. In short, while vendors need a CE mark to sell a plug or a toy, software can be sold without any external test house evaluating the product and its documentation. Since there is no organization that assesses standards compliance, users and customers can only rely on implementors’ statements of compliance. This is definitely an open issue with open standards, and Public Administrations can play an important role, by participating in the work of Standard Setting Organizations as well as by influencing the European market.
It is common to think of standardization as the process of standards creation, but this view excludes those who implement the standard, those who use the implementations of the standard, and what happens when standards evolve or become obsolete. As a matter of fact, today every IT vendor can claim its products comply with a specific open standard, but it does not need to prove it. If European Public Administrations do not take part in the process of certifying standards compliance, we can easily end up buying and acquiring products that implement different versions of a standard, eventually affecting interoperability and communications.
Tomorrow’s data availability depends upon today’s data format.