
Showing posts with label lisasoft.

Monday, 16 September 2013

Changing Gears

Update: OpenGeo is now Boundless

Time to start up blogging again, after taking a month off to change gears. What does changing gears look like?


I am very excited, and a bit intimidated, by the switch to a product-focused company. There is so much going on at Boundless (formerly OpenGeo) that it is going to be a bit of a ramp up for me!

Related links:

Monday, 25 February 2013

OGC and lowering the bar of implementation

I have a couple blog posts up on the LISAsoft website of an OGC nature:

  • OWS 9 and WMS 1.3.0 Adoption: OWS9 is now complete (with an impressive $2.65 million from sponsors answered with $5 million in-kind from industry). For my part I was able to look at Daniel Morissette's excellent advice not to upgrade to WMS 1.3.0 and start attacking the source of the problem - restoring trust in WMS clients.
  • OGC Struggling to Reach out to Implementors: Last week saw a great example of tough love for the GeoPackage standard. Carl Reed is doing the right thing and taking work like OWS Context to the standards@osgeo.org list for review. I had a look and while I am impressed with the level of "reuse" shown, I am concerned with the amount of work (and risk of mistakes) being foisted on client authors.
The common theme here is that OGC takes care of services pretty darn well (this makes sense as it is often what sponsors of events such as OWS9 are willing to pay for). In reaching for a larger audience of client applications the standards body really needs to step up its game and work hard on communication, simplification and lowering the bar of implementation.

I am heading to the AusNZ OGC meeting this week to discuss OWS10 - it will be interesting to see where the priorities are.

Friday, 8 February 2013

LocationTech Announcement

LocationTech has officially gone live, after a good year of planning and ramp-up. Congratulations to all involved. I have a small write up on the LISAsoft website (with links to news articles as I find them).

Monday, 29 October 2012

OSGeo Incubation the Start of Something Spatial

This is my last Latinoware 2012 post (indeed the talk closed out the conference) and arguably the one I am most passionate about. While I am comfortable talking about GeoTools, GeoServer and the ambitious OSGeo Live project, they are all concrete projects you can sink your teeth into and see an obvious benefit from.

But what about the Open Source Geospatial Foundation itself? There have been a number of very good takes on the value of Foundations in fostering open source:
Both the Eclipse Foundation and the Apache Foundation are also casting their eyes towards engaging spatial, raising the excellent question (which our board has been wrestling with) of where to take our Foundation in the years ahead.
My personal hope is that we can look for ways to collaborate, as our mandate is to foster open source spatial and not to draw lines in the sand. We have a precedent with our projects making use of a range of hosting services, and this flexibility offers real appeal for projects looking for a home.

OSGeo Incubation - Start of Something Spatial

This presentation is my take on OSGeo and it is unabashed in its "pro project" viewpoint. The basic structure is taken from the OSGeo incubation checklist, but it is really a shout out to projects about the benefits of joining a Foundation (any Foundation) and why OSGeo should be on their list.
The presentation runs through the expectations the Foundation has of joining projects, and gave me a chance to talk about what we are trying to accomplish (mostly with respect to trust).


LocationTech Teaser

As part of the User-friendly Desktop GIS (uDig) project I have been watching the LocationTech industry working group set up, and was able to offer a small teaser in anticipation of the group making its formal debut.

Latinoware 2012


Thanks for keeping pace with this week of Latinoware 2012 blog posts; normal service will resume shortly.
Once again thanks to Latinoware 2012 for having me, and to Rafael Soto for facilitating. I would really like to see a FOSS4G held in Brasil, especially given the excellent organisation (and enthusiasm) shown at this event.

Also thanks to my employer LISAsoft for giving me a chance to write up my experience. If you are looking into open source spatial, give us a shout.

Saturday, 27 October 2012

QuantumGIS and GeoServer workshops at Latinoware

First up, Latinoware was very well attended (you can see a slice of a group shot below).

From Latinoware 2012

While I have attended a couple of Open Source Developer conferences as a LISAsoft employee in Australia, they tend to just walk the software side of the street.

Latinoware had a great mix of hardware and software hackers. I got to see a 3D printer with the cost brought down to around $300 - a level where it can be successful in emerging economies. I understand it was using recycled soda bottles for material.

From Latinoware 2012

Another impressive showing was a One Laptop per Child kitted out with a robot accessory so it could walk around. Literally "Logo to go" for the educational system - combining the joy of a learning program with the ability to stomp around.

From Latinoware 2012

Most of the presentations were packed, and were not afraid to take on deep or scary topics (i.e. how to build your own bootloader, thanks to the Google Chromebook team). Here is a group with TCP stack frames on screen.

From Latinoware 2012

The workshops at Latinoware 2012 were different from what I expect at a FOSS4G event. We have been scared off running workshops concurrent with the regular program at FOSS4G (as attendees would often ditch presentations for the hands-on experience). For Latinoware I understand the demographics lean toward a student turnout, leaving the pay-to-play workshops with sparse attendance.

Introduction to GeoServer


I managed to sneak into the GeoServer workshop and take a couple of pictures; the workshop was run by Benicio Ribeiro and Rafael Soto.





Quantum GIS


There was also a Quantum GIS workshop hosted by Luis Lopes. If you peek in the background you can see qgiscloud.com on screen as the workshop gets underway.





It was great to see OSGeo projects represented on the ground at such a varied event.

Friday, 26 October 2012

OSGeo Live Latinoware 2012

Moving on from a couple of project-specific presentations we get onto the hard stuff. Presenting OSGeo-Live is always a challenge - simply because the project is so big! It is also a careful balance between presenting as a LISAsoft employee and a member of several of the projects, while capturing the vendor/product-neutral tone befitting an OSGeo Foundation presentation.

A couple of approaches from past presentations:
  • FOSS4G 2010: This is the original OSGeo-Live talk, with a 90 minute time slot. That was enough time to introduce the OSGeo-Live project, explore its history as a LISAsoft project gradually opening up to become community driven, go through the different "GIS" software categories - even in an expert crowd such as FOSS4G not everyone will be familiar with disaster relief, or navigation, as a software category - and wrap up by exploring how to use the OSGeo-Live project for education and advocacy.
  • OSGeo Live Presentation: Cameron Shorter cut down the above presentation with a stronger focus strictly on project overviews, and provided a talking script for others to run the presentation at their own events.

OSGeo-Live at Latinoware 2012


Here are the slides; read below to learn why they are really not the point of the presentation.


I was really happy with the pacing; an hour was just enough time to run this presentation. I tend to run my presentations at two levels (as advocated by Martin Fowler). The general approach is to have two streams of communication: what you are saying, and a separate visual stream that supports what you are talking about.

The OSGeo slides are basically a solid wall of text, so you may wonder what on earth I would talk about. I am certainly not mean enough to just read slides.

And the answer depends on what the target audience wants to hear. For Latinoware 2012 I was faced with a developer heavy crowd (I asked them before the talk started) who did not have a background in GIS. As such most of the project details, feature lists and so on would be a meaningless snooze fest.


In introducing OSGeo-Live it is important to establish the size (massive) and scope (international) of the project. So while there is no way to read the individual contributor names, they can lend weight (of numbers) to the introduction.

Of course I did need to answer the obvious technical questions at a Linux-heavy conference (yes, it uses Xubuntu); the important point is what the project is useful for, backed up with a successful history of releases and events.


In this context I needed to introduce what mapping and GIS was all about. As such I leaned heavily on the software categories used to organise the OSGeo-Live project. While my bullet points were mostly useless, they gave me something to show when setting expectations for the available software.


With the sheer number of OSGeo-Live projects (50!) and only an hour to present I was left with some tough choices. The individual projects still provide value to the presentation, if only by weight of numbers, but I would not be able to communicate a meaningful difference in the available time. I decided to use the OSGeo nature of this presentation to my advantage, and only provide a discussion on the much smaller number of official OSGeo projects. This gave me a chance to explore some projects in more detail and give a flavour of what the particular software category was all about.


For each of the OSGeo projects I was often able to tell a story, reflecting back on the Foundation, the project community or both:
  • For deegree I was able to talk about the importance of standards, and their involvement with INSPIRE
  • Perhaps the nicest story was for QuantumGIS where I was able to point out the OSGeo Chapters acting as project sponsors to fund specific features. This is an excellent engagement model for the Foundation and shows an amazing community
  • Speaking of engagement models, for OpenLayers I was able to talk about the recent OpenLayers 3 funding push
  • For Geomajas I was able to describe the migration from javascript to GWT (even just for technical interest).
  • For GEOS I was able to talk about what a Geometry is and what developers need to get going in this area
  • I did take some time out to talk about our crisis management category, especially given the recent floods in the area.

Note this is just the approach I used for a developer-focused, non-GIS conference. As illustrated by FOSS4G 2010 and Cameron's presentation, this material can be reworked for an experienced GIS crowd as well.

OSGeo-Live for Advocacy and Marketing


In terms of Advocacy we need to reach back, before the OGC standards, and introduce basic GIS concepts. A point I specifically had to make to the developer heavy crowd at FOSS4G was that maps are fun and important.

In order to play in this area developers need to select both a representation of geometry, and a technique for referencing that geometry onto a real-world location. While I was able to use the GEOS project to tell this story, I keenly felt the lack of JTS as a talking point.

Suggestion: Include JTS on the OSGeo-Live project overview. I was able to write up a jts_overview page on the airplane back, and this morning's OSGeo-Live meeting is looking at including the visual test-runner program pictured below.



Suggestion: The project category slides from the above presentation are a nice minimum that could be added to the OSGeo-Live documentation. Beyond that it would be nice to connect with an education sponsor and arrange a couple of "intro to GIS" tracks that can be run directly out of OSGeo-Live. This would be similar to how OGC standards are described; however, a strong education partner is needed to cover the basic explanations that projects continually fail to deliver.

On a project by project basis there are a couple of important aspects missing from the OSGeo-Live story.

  • Love: Tell me about the volunteers behind a project. In large part the community is what you are buying into when choosing a project. There is a vast difference between a popular project like QuantumGIS (most downloads from OSGeo servers) and a specialised tool such as the TEAM Engine (which just joined incubation).
  • Money: Tell me who is paying. It is both polite to thank sponsors for their contribution, and an important piece of information for those considering a project. Even if a project does not meet your feature checklist out of the gates, if it is popular in your industry it may be a good fit (as its future direction should align with your long term needs).

We also need to keep in mind that the OSGeo Foundation (and thus OSGeo-Live) needs to remain vendor and project neutral. OSGeo-Live is a tool for promoting the grand adventure that is open source spatial software.

Suggestion: List the PSC members, followed by their organisation. This is done for Eclipse Foundation projects and provides a good indication of who is involved, and who is paying their way.

Suggestion: I am not quite sure how to thank sponsors tastefully in the context of OSGeo-Live. The State of GeoServer talk does an excellent job of placing sponsor logos next to the feature they contributed to (thanks to Justin and Andrea for this classy move). This is an important story to tell and one OSGeo-Live needs help with. Idea: ask each project overview to include a "sponsors" link, much like is done for "support".

Thanks


Thanks to Brian Hamlin for reviewing the slides prior to wider internet distribution. A consequence of gathering material from previous presentations was working with an out-of-date contributors and sponsors list.

Thanks to Latinoware 2012 for having me, and to Rafael Soto for facilitating (and being such a champion for OSGeo in Latin America).

From Latinoware 2012

From Latinoware 2012

State of GeoServer 2012

And now for the Latinoware 2012 presentation people actually came to see - the State of GeoServer 2012. Once again the content is CC-BY (Attribution) and builds on earlier talks.


The talk raised a lot of questions, both directly after the talk and in the breaks between presentations. The questions all came down to Catalog Service for the Web (CSW) support - and what it means for GeoServer 2.3.

I also got to play the careful balancing act between representing GeoServer (as PSC) and representing the Open Source Geospatial Foundation (i.e. product neutral). There is still a lot of confusion around how to evaluate and select open source projects for a successful engagement.

The most appropriate course of action is to engage with the local community - a major strength of the OSGeo Chapter setup.


Once again the important status update is the release of GeoServer 2.2. This release was a long time coming and brings major headline features, along with important changes under the hood.

Recent activities covered by the presentation:
  • Time-boxed releases: once again this is a deep change that will affect the developer community, our customers and how end users of the application work with GeoServer.
  • Catalog Service: a very simple catalog service, used to publish the GeoServer contents out via CSW. This should allow for easy harvesting by full-featured products such as GeoNetwork. The initial service is working, but I expect more funding will be required based on the enthusiasm shown online and at events like Latinoware.
  • Sensor Observation Service: currently under construction
  • OSGeo Incubation

From Latinoware 2012

State of GeoTools 2012

Here is the State of GeoTools 2012 talk presented at Latinoware 2012. My first speaking slot was devoted to a "State of GeoServer" and "State of GeoTools" status update. The slides are CC-BY (Attribution) and build on earlier talks.


The talk was well received; it is however pretty brutal to start off your conference engagement with a set of technical talks. I got feedback that all my jokes were funny, which is always nice and hopefully softened the content a bit for those present.


The big status news for GeoTools is of course the release of GeoTools 8 with all its headline features.

It is worth pointing out a few recent developments covered by the presentation:
  • Process, Process, Process: the Web Processing Service is finally attracting funding, and with it comes a lot of new process ideas, implementations and directions. Hold on, it is going to be a wild ride!
  • Prep for Java 7 try-with-resources: update our API to mark which items are "Closeable"
  • FeatureCollection as a result set: for Java 5 we needed to prevent FeatureCollection extending java.util.Collection - so that iterators could be closed.
    We are completing this work by removing the deprecated method names (add, remove, etc...)
    This will allow FeatureCollection to be a simple result set.
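The "result set" idea in that last bullet is worth a quick sketch. GeoTools itself is Java, so the following is only a hypothetical Python analogy (the class and method names are made up): the point is that iterating over features may hold a file or database cursor open, so the collection must offer an explicit close() rather than behave like a plain in-memory collection.

```python
class FeatureResultSet:
    """Sketch of a closeable result set: iteration may hold a file or
    database cursor open, so the caller must close it when finished."""

    def __init__(self, features):
        self._features = list(features)
        self.closed = False

    def __iter__(self):
        return iter(self._features)

    def close(self):
        # A real reader would release the underlying cursor or stream here.
        self.closed = True

    # Context-manager support mirrors Java's try-with-resources.
    def __enter__(self):
        return self

    def __exit__(self, *exc):
        self.close()

with FeatureResultSet(["road", "river", "rail"]) as features:
    names = [f for f in features]

print(names)  # → ['road', 'river', 'rail']
```

In Java the equivalent is closing a FeatureIterator in a finally block - or, once the API marks it Closeable, a try-with-resources statement.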

And the change that is likely to have the most lasting effect: switching to predictable release cycles. This is already being noticed with GeoTools 8.1 and 8.2 being released in September.

From Latinoware 2012

Sunday, 8 January 2012

WPS Personality - Chris Tweedie

One of the pleasures of taking in the Spatial@Gov conference was a chance to catch up with Chris Tweedie. I first met Chris when he was an employee of Landgate over in Western Australia, where he served as quite an advocate for open standards and open source. Chris has been picked up by ERDAS and is thus emphasising the open standards half of that equation.
Chris Tweedie
Chris was kind enough to demonstrate the ERDAS support for web processing service. As always I am impressed with the ease of use provided by integrated solutions.

The ERDAS browser client was impressive and easy to use; although Chris does not normally demo this product we were quickly able to figure out what the screens were asking of us and collect "Elevation Change Detection" results back for display.

When we got down to technical details, ERDAS had reached the limits of the WPS specification. The WPS DescribeProcess data structure does not supply quite enough information for their user interface needs (for example, field validation). I hope future versions of WPS will be more helpful in this regard.

That said, they were not limited to their own WPS server, and had performed testing against either 52North or deegree (sorry, I cannot remember which, as they did not have one in their booth to test against).

I also recognised an old friend in the NDVI vegetation model, a classic we had slated for last year's WPS shootout.

Aside: Thanks to LISAsoft for sending me to Spatial@Gov; it was a fascinating look at the Australian geospatial scene.

Wednesday, 16 November 2011

Spatial at Gov

Spatial@Gov is the second conference I am taking part in this week.

Kudos to those who sent lunch packages around to those manning the booths (i.e. me). It was a really nice touch that I have not seen done at any other venue. Classy all around.

Please drop by the LISAsoft booth and say hi.

ESRI Australia Know your Place

Okay I always enjoy thanking sponsors; and ESRI Australia is a sponsor of Spatial@gov. They get extra points for providing free coffee (thanks!).

But I really must call them out on their banner this year:

Really? You sure you want to say that? In Australia?

At many levels it is perfect; a dispassionate fellow stares out over the heads of the plebs passing by...

If you are at Spatial@Gov today; drop by the LISAsoft booth and say hi; we are friendly and inviting; and have an alternative scenario for you to consider.

Cameron has also been kind enough to arrange an OSGeo Aust-NZ breakfast meet up. Join the revolution - support OSGeo.

Open Source Developer Conference

I get a chance to compare two distinct conferences today. The first one is the Open Source Developers Conference - which has been an amazing source of conversation, ideas and energy. Organisation has been top notch with real attention to detail (the conference passes include the program, so people avoid having to juggle bits of paper when checking where to go next).

LISAsoft is currently hiring, so I am supposed to be on the lookout for new talent. I am afraid I am being distracted by all the fascinating and creative work.

Sunday, 4 September 2011

jpeg vs geotiff speed

A really common question is why JPEG performance is slow.

JPEG is a compression format based on how the human eye works; the eye is very good at seeing a color on one side of a "sky" and another color on the other side, and filling in a nice smooth gradient in your mind - even when the actual data shown on the monitor does not have a smooth gradient.

The JPEG standard uses other such tricks of human perception to throw out information that is not needed, because your mind will fill in the gaps, giving you the same experience.

To do this it needs the *entire* image; and it also needs the entire image shown to your eye.

So it is really not very suitable for GIS use where we expect the image to reflect measurements.

Formats like GeoTIFF and ECW are organised to be read off disk; depending on where in the image you are, they can calculate what area of the file to read, and thus display part of the image without having to read the whole thing.

Even when zoomed out, because the file structure is organised, readers can sample the pixels to pull out just enough information for what is on screen and no more.

For GeoTIFF they can even go beyond this and include an overview of the file which we can display when JMapPane is zoomed out, or internally structure the file with "tiles" for even better performance (less disk seeking) when zoomed in.
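The arithmetic behind tiled reads and overviews is simple enough to sketch. This is not GeoTools code - the tile size, image dimensions and function names below are made up for illustration - but it shows how a tiled reader works out which tiles a viewport touches (so it can fetch only those), and how an overview level might be chosen when zoomed out:

```python
def tiles_for_window(x0, y0, x1, y1, tile_size=256):
    """Return the (col, row) tiles a pixel window touches; a tiled
    reader only needs to fetch these, not the whole image."""
    cols = range(x0 // tile_size, (x1 - 1) // tile_size + 1)
    rows = range(y0 // tile_size, (y1 - 1) // tile_size + 1)
    return [(c, r) for r in rows for c in cols]

def overview_level(full_width, screen_width, max_level=4):
    """Pick the pyramid level whose resolution best matches the screen;
    each overview level halves the image width and height."""
    level = 0
    while level < max_level and full_width / (2 ** (level + 1)) >= screen_width:
        level += 1
    return level

# A 512x512 viewport into a huge image touches only 9 of its tiles...
print(len(tiles_for_window(1000, 1000, 1512, 1512)))  # → 9

# ...and a 40000px-wide image shown on an 800px screen is best served
# by a coarse overview instead of the full-resolution raster.
print(overview_level(40000, 800))  # → 4
```

A JPEG, by contrast, has to be decoded in full before any of those nine tiles' worth of pixels are available.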


There is some good background information on these kinds of topics  from the LISAsoft 2009 making maps fast workshop:
- http://download.osgeo.org/osgeo/foss4g/2009/SPREP/0Tue/Parkside%20GO2/0900/

Sunday, 31 July 2011

Social Programming - Open Source as the original social network

Going to recycle a Google+ post which has been taking shape with a bit of feedback: comparing and contrasting social media vs traditional open source communication, with some ideas on the future of open source in the face of all this social stuff.

Social Media

Google Plus

Okay, so I have way too many social media things. So far Google Plus is mostly about people learning the ropes and commenting on how they plan to use things (just like this post!).

For the short term I am going to make useful circles and treat them like small email lists. It is kind of a feature that Gmail lacked; and since Google Plus offers to send email it will at least give me a reason to try this one out.

It also looks like I can use it as a setting in which to interactively create a post such as this one, updating the content based on feedback before punting it out as a blog post.

Facebook

Long term I would (like everyone else) like to see Google Plus cut down my use of Facebook. There is a nice "Geek and Poke" cartoon that sums it up best; as with all good cartoons of this nature it is funny because it is true. In the 90s the Internet really was Internet Explorer (on the desktop); in 2011 the Internet really is the "F" (on desktop/phone/tv/etc...).

I don't really find this scary as the most popular application to use the web changes over time; we had email, IRC, websites (vs news feeds), blogs and it is about time we had a break from Google "being" the Internet.

Since my professional contacts started hunting me down on Facebook, that has mostly curtailed its use as a tool to catch up with friends.

LinkedIn

Speaking of professional contacts ... I also logged into LinkedIn recently (as with any social site I created an account years ago in order to avoid the constant invitations).

The result is I actually like their service enough to install the phone application. Mostly it is serving to remind me how global our "open source" profession is. I don't really see Google Plus treading on this one, as it is behaving a bit more like an online resume/recommendation site and a good way to check or confirm professional background.

Twitter

Twitter is an interesting one; I have tried following a couple popular people on google plus - something I never tried on twitter (and I am not sure I will continue it on google plus).

So far Google Plus lacks the all-important "gossip" vibe of Twitter - mostly this shows up when I tweet about a new GeoTools or uDig feature, and then for the next several days I can watch that news trickle out as different people repost. It is an interesting way to gauge interest, and a reminder that much of what is needed for open source is work (and not interesting enough to retweet).

Traditional Open Source Communication

For open source there are a number of other "social network" tools we bring to bear, as this is a public activity:

IRC

The use of IRC is still steady for many of the open source developers I work with; however I am increasingly seeing twitter take over that role for the new younger generation. It is often the only way to know what someone is up to.

I am going to list Google Talk, Yahoo Chat, Facebook chat and basically anything else I can register with iChat in this category.

Special mention is made for Skype, which with the ability to share a desktop session is one of the most effective tools for a "code review" or a "breakout session" on a specific bug. We do a lot of work on uDig this way and it is very effective - especially for cross-platform issues.

Issue Tracker

Depending on the issue tracker used, this can really take on a social network vibe. JIRA offers the GeoTools project comments, collaboration, tracking, the ability for users to "watch" a conversation or "vote" an issue to greater attention, and the ability to capture screen snaps, logs and files - not to mention the joy and frustration of bug fixing.

In a sense the issue tracker is the only social network that matters to an open source project; it has evolved over time to focus on the social interactions required to get the software patched and releases made on time.

Developer Email List

Email list use has been on a steady decline as a lot of development conversation moves over to the bug trackers. They still get a little bit of organisation, planning and design discussion.

User Email List

The user email list is still popular, but is being threatened by Q&A sites such as Stack Exchange and others. These have several key advantages; the most attractive one from a project standpoint is not having to answer the same question multiple times. The system where answers are voted to the top is also more helpful than expecting people to sift through email archives.

Update: Whatnick points out that email lists can sometimes be used in a forum mode (as per Nabble or similar). I don't find this a separate communication medium from a "user email list", and indeed in the past I found "forum" use actively harmful in that it encourages users to browse, rather than sign up for the email list and take part in the community.

I do remember the only way to understand Eclipse in 2004 was to sign up for the news group.

Wiki

The wiki is really the dawn of Web 2.0, originating from the Portland Pattern Repository and friends.

The wiki has been pretty much killed off due to vandals; with news announcements migrating to blogs, design discussion migrating to JIRA, and user discussion moving to Q&A sites, the general purpose wiki is on the way out. The only thing GeoTools is using its wiki for today is the RFC procedure and a staging area for design notes.

Blogging

Blogging is an interesting one; it has succeeded in gobbling up a lot of what I used to capture in a wiki (announcements, informal chatter and ideas). A lot of the benefit comes from planet.osgeo.org and others extending the reach of my personal blog.

The ability to follow blogs and watch RSS feeds get manipulated and chopped up for re-consumption is otherwise not widely used. Usually for a blog post to work you need to "time it right" so it shows up on a website such as Java.net, or more importantly watch it be voted up on DZone, or simply collect Google web hits based on the post having actual content (my preferred technique).

In terms of staging a conversation comments have almost universally been turned off; the conversation has diverged to places where the people are (DZone, Twitter, etc...).

The promise of the blogosphere is rarely met - great public quarrels (in the traditional War of the Roses sense) happen only rarely, on grand topics. Most recently the future/currency of OpenLayers (resulting in a flurry of half a dozen amusing blog posts answering each other from different points of view).

I am lumping RSS feeds into this category, as a blog is almost by definition a source of RSS feeds. Best consumed via Google search, or via a custom reader app such as "Reeder" while on the bus.

The Future of Social Programming

The Integration is Now

The most common thing to do after making a blog post is to tweet the link to those who follow you. Kind of like an RSS feed that people will actually use.

More recently tweets get automated - many blog posts I publish are watched by simgeo, with a bot tweeting as soon as I hit the publish button. Depending on how far that item is retweeted you can gauge how interesting the topic was.

Integrated Social Programming

Can I coin that as a phrase? Kind of like an IDE for the communication half of an open source project.

More seriously, GitHub offers an interesting alternative to consider - setting up an Apple-like "cultivated garden" with nice code review tools, workflows to facilitate change management, etc...

Projects like Codehaus, Google Code, SourceForge and so on also fall into this category. They gather up many of the traditional communication tools and provide an easy to use front end.

Testing, Testing is this thing on

So what is my expected half life for this post?
  • Google Plus - around a week (as people hit my profile it will be my top post; and people are still adding me)
  • Twitter - days, depending on how many people I know retweet it
  • Blog - posts tend to lurk on Planet OSGeo for around a week. My posts are usually just facts; so I rarely get an answering blog post on an opinion piece. Indeed it is easier to say something wrong if you want your blog post to last.
  • Facebook - zero (I would not annoy my friends with this post - although they would read the comic)
  • LinkedIn - will pick up my tweet.
  • Buzz - did they bridge that to Google+ yet?

Open Surface and the value of Public API

Microsoft recently posited the idea of "open surface" as an alternative to "open source", correctly noting that stable public APIs, protocols and standards are of great value.

This is actually a position I have great sympathy with:

  • OGC open standards are considerably easier to adopt than the pay-to-play ISO ones.
  • Sextante - Is proprietary software better for developers? noted that when a public API is the only mechanism for communication with developers, the API / docs / code examples are very much improved as a result
  • This is contrasted with "Open Source Code" translating as open the source and read the code - a different value proposition to be sure!
  • API stands for "Application Programming Interface" - and is generally the information needed to use an application or library as a programmer. It documents what can be done on a function by function (or method by method) level. Traditionally this was limited to describing the parameters, inputs and outputs, and for really good docs providing an example use.
Related documentation:
  • GeoTools javadocs
  • uDig - does not publish javadocs (as they are generated as needed in Eclipse and shown as tooltips during development)
  • GeoServer javadocs - I could not find them easily as the website is primarily user focused. As with uDig docs I did not notice they were "missing" as Eclipse generates them on the fly.
I will point out that the general approach here is to really lean on tools like Javadoc or Sphinx that process API documentation written alongside the source code.
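Since the passage above leans on Javadoc and Sphinx, here is what that style of in-source API documentation looks like in JSDoc form (the `shift` function and its behaviour are invented purely for illustration):

```javascript
/**
 * Shifts a [longitude, latitude] pair by a fixed offset.
 * (Hypothetical helper, invented for illustration only.)
 *
 * @param {number[]} lonLat coordinate as [lon, lat] in degrees
 * @param {number} dx offset added to the longitude
 * @param {number} dy offset added to the latitude
 * @returns {number[]} a new [lon, lat] pair; the input is not modified
 * @example
 *   shift([151, -33], 1, 0); // => [152, -33]
 */
function shift(lonLat, dx, dy) {
  return [lonLat[0] + dx, lonLat[1] + dy];
}
```

Running a tool such as JSDoc (or Javadoc for Java, or Sphinx for Python) over comments like these produces the browsable reference documentation discussed above, keeping the docs right next to the code they describe.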

Sunday, 19 June 2011

How to Buy Open Source

One of the great things about working at LISAsoft is putting together interesting training courses for our customers. Recently LISAsoft has started rounding up some of the best, putting up a schedule, and taking these courses into wider circulation.

As Cameron has indicated, Australia has joined the ranks of countries explicitly asking their government departments to consider open source in their "purchasing" decisions.

I have been gathering material for a course on this topic, as the kind of selection criteria considered when evaluating open source projects reflects a different set of risks than a procurement department is perhaps used to.

Popularity

A couple of months ago I went over the amount of email some of our open source leaders put out. While this does indicate amazing dedication to our OSGeo community, it is perhaps not the best metric of productivity.

I have also been following the various web processing service implementations, and the amount of email is a real indication of popularity and activity.

(The above graph is just a reflection of activity. Both deegree and GeoServer are blips, as not all of their activity reflects "wps" work. Some of the other projects just seem to have one email list for both development and user questions. Projects that successfully turn users into customers will also reduce email volume by handling support offline. Figuring out how to deal with all this is one reason why you should take the course!)
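The counts behind a graph like this can be scraped straight from a project's mailing list archive. Here is a rough sketch (assuming the conventional mbox format, where each message begins with a "From " separator line ending in a date; the field layout is a guess at the common case, not a robust parser):

```javascript
// Rough sketch: estimate mailing list activity by counting messages
// per month in an mbox archive. Assumes the conventional mbox layout,
// where each message starts with a separator line such as:
//   From alice@example.org Mon Jun 13 10:00:00 2011
function activityByMonth(mboxText) {
  const counts = {};
  for (const line of mboxText.split('\n')) {
    // sender, weekday, month, day, time, year
    const m = /^From \S+ \w{3} (\w{3}) {1,2}(\d{1,2}) \d{2}:\d{2}:\d{2} (\d{4})/.exec(line);
    if (m) {
      const key = m[3] + '-' + m[1]; // e.g. "2011-Jun"
      counts[key] = (counts[key] || 0) + 1;
    }
  }
  return counts;
}
```

Counting separator lines is crude (it ignores cross-posting, digests and spam) but it is enough to compare relative activity between projects.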

Productivity

The other aspect to this is using the open nature of open source to evaluate those offering support level agreements. This is a case where we can really look into the productivity of an organisation and examine their experience and track record.

(The above diagram is for the GeoTools project, issues resolved in the last quarter. GeoTools developers sell very few direct support contracts - instead focusing on the support of downstream products. Making for a nice graph I can safely publish without ruffling any feathers)

This level of visibility, combined with support being an area of heated competition, provides a procurement department with more leverage (and responsibility) than simply talking to a sales representative.

Open

Measuring how open a project is counts as one of the top mandates of the OSGeo foundation incubation process.

Any project emerging from the incubation process has:

  • A public email list (although sometimes that is not the only email list!) - allowing you to produce graphs such as the above
  • A public issue tracker - allowing you to produce graphs such as the above
  • Open development procedures, allowing you to determine how much effort is required to participate and perform common tasks (such as a release)
  • An open review of the legal status of the codebase - giving you some confidence the code won't be pulled for legal reasons, leaving you high and dry
Bringing this level of visibility to a project is one of the reasons I am a fan of the incubation process (although it is quite a burden given recent graduation rates).

There is of course more to measure: recent requests to summarise quality assurance procedures for example. In my experience the "open development procedures" ends up providing a great view on the QA controls for a project. Putting that together into a table for review and comparison is just a matter of research.

Monday, 13 June 2011

OSGeo Stay out of training - the pros and cons of certification

Tyler is doing an excellent job of broaching difficult topics.

The long and short of it is that the foundation is sensibly looking at revenue streams to support its activities. There are lots of great ideas for what the foundation can do; but accomplishing (or even setting goals) requires resources.

If you are interested in the following rants, a more diverse set of opinions is available on discuss@osgeo.org.

Rant Zero: Income

The foundation does need a source of income; it may not be obvious but we do depend on the foundation for a wide range of things from legal advice, to a friendly voice on the phone (that would be Tyler) to the nuts and bolts of keeping the servers warm and the lights on when people visit osgeo.org.

Rant One: Projects

Clarification: Projects are not a source of volunteers (sorry - not even for really good ideas).

Even if an idea is good, it will need to have some volunteers (or budget) attached for it to be successful. The projects are as I understand it already maxed out ... making software.

Indeed I can think of a lot of ways in which projects could be better supported that have not come through ... yet.

  • push to bring more volunteers into the development teams of each project.
  • fund maintenance activities (documentation, technical debt, quality assurance, external security review)

The osgeo discussion list also shows individual projects seeking ways to raise funds. There is a classic tradeoff between an OSGeo backed 'donate button' (difficult to explain where the money goes) and the 'bounty model' used to fund the development of specific features (difficult to explain how to include maintenance and documentation, let alone the foundation).

Rant Two: Training

Training is one of the few "business models" that almost all the projects share to support the various development teams. I cannot really see a way for the foundation to offer training without conflicting with existing providers.

The best I can come up with is taking away the "provider list" and putting it behind a pay wall. The foundation could send out the latest copy of the list on request; in exchange for a referral fee of some sort.

The danger is the foundation appearing to recommend a provider or training course that ends up being terrible. I cannot think of a cheap way for the foundation to supply any kind of quality assurance of providers, making this a risky approach to raise funds.

  • Quality could be improved by asking the PSC to review providers (in exchange for some of the referral fee). Good way for teams to raise maintenance funds.
  • The foundation could define criteria for evaluation: GeoServer for example organises its "commercial support page" based on participation and provider experience.

Rant three: Certification

Here I am going to cheat and recycle and edit some of my emails:

With respect to Paolo Cavallini commenting about the foundation supporting our work, not competing with it:

While I concur (I don't want to see the foundation set itself up in competition) there may yet be a useful role to play.

What I cannot figure out is how the foundation could expect to make any money from this angle ... any figuring of costs I go through makes it look like a massive effort.

As for the useful role: if OSGeo was able to supply a certification test, provide independent marking, and issue the resulting certification, it may actually complement existing training offerings by the "professionals and enterprises" and make training easier to sell. This would both validate the training offered and act as a competitive advantage - right now, given a choice between two training courses, people will often choose the option that gives them a chance at sitting a certification at the end (especially if they have a limited budget and don't really care what it is they are learning).

A couple of things are clear to me about this discussion:

  • a) I *hate* certifications; I feel they prey on the disadvantaged of our industry right when they are weakest (this goes for both job hunters and those going through a hiring process)
  • b) certifications are really required in different markets around the world (especially when industry has lost confidence in the meaning of a university degree).
With the above in mind I feel that certifications will happen; and given a choice I would rather it happen at the foundation level (rather than getting people certified in different product stacks).

I have some mechanics in mind: certification should cover the open source process, not only use; demonstrate ability; aim for a 50% pass rate so that the certification means something; and offer a "bulk" discount to groups wishing to use the tests at the end of a training course, or as part of a hiring process.

What I cannot figure out is where the profit is, or how to pay for people's involvement. While groups offering training could collaborate (and possibly act in a double-blind capacity to mark results), it would probably require some paid hours to get projects to look at the tests and make sure they mean something at the end of the day.

Pricing the tests would probably be within market norms; and I would expect a much cheaper retry cost (possibly just covering marking time) if we manage to make the marking process brutal enough to be useful to potential employers.

One thing we have a chance to do well here is stress the soft "open source" skills that a potential employee must have in order to be successful, rather than only mechanical questions about configuration and use. Examples: link to 3 questions you have answered on the user list, two issues you have reported, etc. (which can be marked for completeness).

Finally there is the annoyance, for companies already established in this space, of possibly competing with new groups that have picked up their certifications and appear better "on paper". I cannot honestly have much sympathy here; competition is as competition does. The best advice would be to help define the certification (and allow that to be placed on a resume).

To clarify how certification makes the case for QGIS training courses stronger, and does not conflict with existing training offerings:
I don't think anybody is interested in the foundation competing with existing training courses. (Indeed training is one of the few places where any cost recovery on the udig project occurs).

That said if you don't want OSGeo competing in training - how would you like to pay for the foundation? I am not sure if your organisation sponsors OSGeo? I don't think my employer does (preferring to volunteer marketing effort); and I don't personally sponsor the foundation (preferring volunteer effort myself).

So this is the nice part about certification:

  • it would make your training courses stronger (ie more attractive to customers)
  • it makes training an easier thing to sell (take training as one step towards getting ready for certification)
  • it would make QGIS more attractive (as a technology in which certification was available)
  • it provides the foundation with a revenue stream that does not compete with any of the member organisations (Indeed certification is a "service" that very few organisations could offer credibly?)
From the QGIS standpoint the benefit for you really is focused on those first couple of points; certifications would be an additional activity the foundation could perform that would make your training courses more valuable.

My own thoughts on this (using your QGIS project as an example for how certification supports training):

  1. Testing criteria
    • organisations offering QGIS training are asked to supply criteria to use for the certification process (If your organisation wants to be involved this is where you would take part)
    • the foundation pays for someone to write the test material for a specific QGIS release (perhaps you? perhaps another vendor?)
    • the test is passed around to those supplying QGIS certification criteria for review; production of an answer key etc...
  2. Next time you do a training course offer your customer the option of either:
    • a) taking the certification tests at a later date (you can pass on the foundation contact details; and get a 30% cut in thanks for the referral)
    • b) arranging for a "bulk purchase" where you can offer your customer a discount for doing it then and there (perhaps give the customer a 20% discount to make it more attractive). You would need to play with the numbers to make this attractive (so customers don't just order the test for their top people).
  3. Each month the foundation hires one of the organisations that defined the testing criteria to mark the tests
    • a) a month is chosen to have enough tests together in one spot to make effective use of time
    • b) the organisation hired should follow a set rotation to be "fair"
    • c) the organisation hired should probably not be responsible for the training of any of the students being marked in order to keep this as independent as possible
  4. Marking should be brutal
    • a) the idea is to force a spread so that potential employers can actually respect the certification
    • b) cover open source activities (bug submission, contribution to documentation, participation on the user list). If it is any kind of advanced certification this goes into building the application from source code, applying a patch and building locally (can submit a screen snap of the result), links to accepted submissions etc...
    • c) How brutal? How about if they get everything right they end up with 90%; the last 10% is there to allow markers to recognise "outstanding"
    • d) if you really want to soften the blow you can provide different levels of certification out of the same test (confusion may not be worth it; easier to fail people and ask them to try again)
  5. Updates to certification should be cheaper and repeatable
    • a) as each release comes out the certification criteria should be updated
    • b) a cheaper rate for "repeat customers" should be available - to encourage this both as a revenue stream and as a certification process that employers can trust to be up to date. Why hire someone certified in QGIS 1.6 when QGIS 3 has been released?
    • c) the cheaper rate should also be available to those repeating the same test (partly to soften the blow due to the expected failure rate)
    • d) Updates are going to have to occur often to reduce cheating; we have a slight advantage in that we are testing real skills (the test can ask for maps produced with QGIS) and real interaction (the test can ask for links to nabble showing participation).
The other scenario for using the certification tests is:

Next time you hire someone

  • a) Buy a "bulk purchase" of tests
  • b) Ask applicants to take the test; and submit review (this is nice for them because it is on your dime; and nice for you as you get an objective evaluation)
  • c) The foundation arranges for someone to mark this pronto as part of the service; probably only returning details on the top five candidates
  • d) The foundation could charge more to access test results in detail

The final touchy subject is "discounts":

  • a) arranging some kind of discount for graduate students (perhaps if their professor helps with the marking it could be arranged at the school level).
  • b) I hate asking graduate students for money; graduate student money is better spent on beer :(
  • c) arrange some kind of discount for "osgeo volunteers" perhaps with an email from a recognised osgeo committee chair (project steering committee, education committee or something). Because I don't mind asking graduate students for volunteer time ...
  • d) Being tough enough not to offer discounts to usual suspects (project developers, osgeo sponsors, people we really like ...). The more discounts that are around the lower the perceived value of the certification; we should try and get people to pay full price once; and then pay to retake the certification (either because they failed or because a new version of QGIS came out).
I am being very strict about not using the word volunteer in the above activities; your company / organisation should be paying for your involvement. And the OSGeo foundation should be hiring your company / organisation to set the certification tests, perform marking etc... This is very much pay to play.

Monday, 30 May 2011

OpenLayers 2.10 Beginner's Guide

Thanks to Cameron Shorter for putting me in touch with Packt Publishing so I could review the new OpenLayers book. This is a great chance for me to look at OpenLayers, as it is not something I get to use professionally - I usually live on the other end of the stack.

Summary, since you are unlikely to read a full book review:

  • Great introduction to OpenLayers by Erik Hazzard
  • Around $45 on Amazon
  • Available from Packt Publishing for a bit cheaper.
  • There is a sample chapter available.
  • James Fee Review
  • Geoweb Guru Review
  • Review - How well does the book address these issues:
    • Typesetting: Paper (8/10), PDF(7/10), ePub (5/10)
    • Documentation: (9/10) - this is the one that matters
    • Usability: (7/10)
    • Examples: (8/10)
    • Explanation of the general architecture: (5/10)

Updated: removed rant about documentation for another post.

Reviewing OpenLayers 2.10 Beginner's Guide

Here is a recent tweet from Volker Mische of GeoCouch fame: Why people use web mapping libraries other than #openlayers http://t.co/e4V3VDS #wceu

Problems with Open Layers:

  • Documentation
  • Default Look
  • Usability
  • Examples are not good
  • No explanation of the general architecture
It is indicative that most of the problems above are not technical in nature, but reflect a lack of documentation. Update: Volker has an entire post which covers this list as does crschmidt.

I will use Volker's list to frame this review, how well does this book fill in the gaps mentioned above?

Typesetting

So what is OpenLayers 2.10 Beginner's Guide like as a book? It is one of the first ebooks I have tried to deal with directly, so the first step is sorting out how to read it in a productive fashion.

End result: ePub may be the future of publishing, but at least on a Mac the future is out of reach - stick with PDF. To make this setup productive I needed to go grab a second monitor: one for the book and one to work in.

Edit: removed rant about documentation for another post.

Finally I printed out the first section to see how the typesetting comes across in book form.

  • Paper (8/10):

    The typesetting is attractive, with a clean and consistent presentation. It features a distinctive look, with bold wide headings that act like a black hole on the page (especially when reversed). In physical form this is offset with nice wide white margins for an attractive, balanced presentation.

    Code is presented clearly with line numbers so you can refer to a specific line. There is a missed opportunity here as the code does not have any kind of formatting (for example bolding keywords to enhance readability). While this sounds like a small issue it really starts to add up after a while especially when examples mix strings and javascript code.

    Sections are clearly marked, and importantly consistent allowing you to flip through the content and scan for sections such as "Time for action" that identify step by step instructions.

  • PDF (7/10):

    I have to guess that the eBook is intended as a supplement to a printed copy. A couple of things let it down: 1) the large margins detract from the on-screen reading experience; 2) unfortunately the line numbers on the code examples are picked up when trying to cut and paste into a text editor.

  • ePub (5/10):

    I am going to guess that the problems with ePub (on every single reader, and noticed by James Fee) is a quality control issue that the publisher can fix in due time.

Documentation

So does this book address Documentation as a weakness of OpenLayers?

I have two things I expect of a book aimed at beginners:

  • Instant, visual, satisfaction

    Basically make me a believer, or at least believe enough to continue reading.

  • "basic as a cookbook" instruction.

    step by step showing me what to expect as I go.

Open Layers Documentation

For comparison, the OpenLayers documentation is mostly devoted to live examples. The major, and probably underrated, advantage is that these are a) always current and b) easy for developers to maintain. There is also a FAQ, which looks to be answering questions about the project (who cares?), and the start of Sphinx-driven prose documentation which looks promising.

While the examples may be a good reference they really are not suitable for beginner docs. Let's see how the Sphinx docs do:

  • Instant, visual, satisfaction? The sphinx docs start out with an Introduction that is perfect in this sense. The very first heading is Creating your First Map

    Total time for visual: 2 mins

    While this may not be the best metric for evaluating a book, it is at least how I evaluate one in a book store. Can I understand the "hello world"? If not I hunt for a simpler alternative.

    Did I understand anything of what went on? Not at all (sigh). Simply adding the empty *script* block so I knew where to cut and paste would have helped. The only reason I was successful at all is because they put the example together at the end.

  • Basic as a cookbook? The Sphinx docs fall apart here, as they assume I know what I am doing (surprise: I don't - where does the script block go again?). The steps are not numbered (helpful when all the concepts are new and you cannot tell the paragraphs apart). And, surprisingly for OpenLayers, there is no visual included on the page showing what the results should look like (really important for a beginners guide, as it is so hard to know if you have completed something correctly).

    To be fair these are probably not beginners docs and the intended target audience probably knows more than me about how things work.
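To make the "where does the script block go" question concrete, here is a minimal first-map page of the kind the Sphinx docs (and the book) build up to. This is a sketch only: the `OpenLayers.js` path and the WMS URL are placeholders you would substitute with your own copy of the library and a real service.

```html
<html>
  <head>
    <title>First map</title>
    <!-- adjust the path to wherever your copy of the library lives -->
    <script src="OpenLayers.js"></script>
  </head>
  <body onload="init()">
    <!-- the map renders into this element -->
    <div id="map_element" style="width: 500px; height: 400px;"></div>
    <script type="text/javascript">
      function init() {
        var map = new OpenLayers.Map('map_element', {});
        // placeholder WMS endpoint; point this at a real service
        var wms = new OpenLayers.Layer.WMS(
          'Base layer',
          'http://example.com/wms',
          { layers: 'basic' }
        );
        map.addLayer(wms);
        if (!map.getCenter()) {
          map.zoomToMaxExtent();
        }
      }
    </script>
  </body>
</html>
```

Note the script block sits after the `map_element` div (and runs from `onload`), so the element exists by the time the map is constructed.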

So how does OpenLayers 2.10 Beginner's Guide fare:

  • Instant, visual, satisfaction? (7/10)

    Getting something on screen is covered in the first chapter. While we do have to sit through an introduction and background information, we do finally arrive at "Time for Action" section on downloading open layers followed by "creating my first map".

    Total time for visual: 30 mins!

    As for the long time; I was unable to cut and paste from any of the reader formats (downside to those helpful line numbers).

    The following shows how selection works in the PDF:
    And the resulting paste into a text editor:
    map var
    );
    = new OpenLayers.Map('map_element', {}); wms = new OpenLayers.Layer.WMS(
    

    After typing things in I managed to produce an empty map! After checking and rechecking for typos and trying different browsers, I finally clicked on the page, which resulted in the map being shown (a ha, something is working!).

    A few questions on IRC (thanks crschmidt) and I was able to sort out my mistake:

            map.addLayer(wms);
            if(!map.getCenter()){
                map.zoomToMaxExtend();
            }
        
    Can you spot it? It should be map.zoomToMaxExtent().

    So while I would love to give this one high marks for the clear instruction; it was a frustrating introduction to the book.

  • Basic as a cookbook? This is where the book really hits it out of the park.

    As other reviewers have mentioned, the second chapter on firebug really underlines the fact that this is a book aimed at beginners.

    While intellectually I know that Firebug is the standard environment for working with Javascript without going crazy, I only know this from watching coworkers. So I am thankful for the kind introduction, using a domain I have some familiarity with.

    In terms of the step by step directions I find Mr Hazzard's writing clear, and more importantly on topic. Any explanation is safely separated into a "what just happened" section so it does not get confused with the step by step instructions.

    The other important aspect of step by step instructions is the pictures, allowing you to verify progress and have the all-important reassurance that you are on track. The pictures are plentiful but not pretty; understandable given that the result needs to be printed in black and white.

    I did suffer a little bit of platform and version burn during the initial install instructions. I wished the instructions were given once each for Linux, Windows and Mac, as it is hard to follow a mix of Linux and Windows directories and then be presented with a screen snap of some kind of Linux file explorer in order to verify you did the steps correctly.

    Once the book gets underway these platform-specific issues vanish. There is a small "bait and switch", with the preface indicating that you can use any browser followed by the second chapter being dedicated to Firebug.

Still, this is an excellent example of how to write documentation for beginners. Working with Firebug is really assumed knowledge that anyone calling themselves a Javascript programmer (and thus a candidate for working with OpenLayers) is normally expected to bring to the party.

The rest of the book continues in this fashion offering a nice logical introduction to key concepts.

One shock for me coming from a GeoServer background was how much isolation OpenLayers offers from OGC standards. A good example is style handling, with Symbolizers being introduced directly and no reference made to the Styled Layer Descriptor standard.

Default Look (9/10)

Next up on Volker's list is the default look of OpenLayers. In this case the book goes way beyond what I was expecting from a simple beginners guide. There is an entire chapter devoted to themes, covering how to style the controls used by OpenLayers.

I am not sure what the OpenLayers examples are like in this area; but I expect the book will help websites migrate away from the default looks.

Usability (7/10)

Usability is one of the more technical issues on Volker's list. This usually comes down to making a web application simple and direct to work with, and "snappy".

The book does manage to mention performance in a couple of sections, and emphasises the benefit of a fast web mapping application along with specific guidance on TileSets vs WMS. It also goes through the exercise of cutting your OpenLayers file down to just the parts needed for your web mapping application.

So for a beginners guide this book is aware of performance and usability issues and takes some steps to address these matters.

Examples (8/10)

This is an interesting entry on Volker's list - "The examples are not good". Personally I like the examples on the OpenLayers website, but I agree they should not have to bear the entire responsibility for documentation.

The examples in the book are more in the nature of step by step instructions, while I have a terrible time cutting and pasting out of the PDF, they are effective and plentiful.

One thing I would love to see in both the book and the website is the use of the Natural Earth dataset (or any other attractive dataset). The default blue and white map has come to define what OpenLayers looks like; while it may be functional for examples, I am sure we could come up with something more exciting.

Explanation of the general architecture (5/10)

This is an interesting topic to consider with respect to a beginners book. I certainly feel like I know some of the moving parts in an OpenLayers application, but I expect I need to build a few applications to feel comfortable.

With that in mind this book really does span a broad range: starting off with some object-oriented basics in Chapter 1 (for those new to Javascript), a couple of reference sections for the key concepts of Map, Layer, Vector Layers and Style, and a chapter on making a web mapping application that brings it all together.

So as a reference the book performs admirably. Does it address Volker's question about the general architecture? Probably not. I would have loved a picture of how OpenLayers fits together to give me a better feel for the general architecture. Indeed I almost expected one after being introduced to object-oriented concepts in Chapter 1.

Recommendation

I am happy to recommend OpenLayers 2.10 Beginner's Guide; it offers a nice introduction set at the correct pace for beginners in web mapping. The publisher has an opportunity to fix many of the technical difficulties present with the electronic editions of this book.

Thursday, 16 December 2010

taking shape

My original thought on this blog was that I spend more time updating docs than writing blog posts. So it would be easier to discuss open source fun while linking to docs.

Today I am going through the uDig release process; with the goal of a 1.2.1 release.

This release should be a really good time technically:
- GeoTools 2.7-M4 (released a couple weeks ago for GeoServer)
- Eclipse 3.6.1 serving as the foundation

If you would like to help out with testing a preview of uDig 1.2.1 is available here:
http://udig.refractions.net/download/unstable/

And our notes during testing:
http://udig.refractions.net/confluence/display/UDIG/1.2.1

The other exciting thing about this release is a steady stream of new user-visible features, brought on by the generosity of the uDig community.

This was helped along by our move to gitorious.org (and away from svn). The move from svn was timed perfectly: Refractions was recently unable to restore our old svn repository. We were able to rescue the uDig 1.1.x codebase and it can now also be found on gitorious.

As for the original blog idea (and a sneak peek at 1.2.1), here is documentation harmed in the making of this post:
http://udig.refractions.net/confluence/display/EN/Raster+Style+Pages

Sunday, 12 September 2010

OSGeoLive Lightning Overview

I am trying out SlideShare as an easy way to share the OSGeoLive Lightning Overview; I will update this post when I find a website to actually store these slides on (any idea if the FOSS4G 2010 website will host them?)
I would like to thank everyone who came to FOSS4G this year, especially those who sponsored the event or set up a booth. We spend a lot of time thanking the various development communities, user communities, education community, and conference organisers - all with good reason. A big thank you to the event and project sponsors for facilitating all that we do.