I was at the Plone conference last week in Seattle. There is something about rubbing shoulders with hundreds of developers that keeps your energy level up. People were engrossed in discussions of content management, configuring and installing portals, and hacking Python code, and it all seemed so natural. The theme of the conference was Content, Collaboration and Community, and all three were very much in evidence at the event.
Plone is a content management system and an open source portal framework built on top of Zope. Zope is written in Python and combines a web server, an application server, a built-in object database, a powerful search engine and an awesome component framework, and it is very, very extensible. The latest version, Zope 3, is strongly component oriented. Plone, the content management and portal layer on top, makes Zope easy to use: it supports a wide variety of built-in content types, allows you to define your own, and is very flexible with a skinnable interface. The Plone+Zope combination beats any other portal/CMS framework in the industry today.
The beauty of this dynamite combo is that you can install and set up a portal in less than an hour. It is an ideal combination of simplicity, flexibility and power. And it is all free.
The Plone community is very close knit, and I heard at the conference that all the Plone consultants are busy with more work than they can handle. I had been to a conference a few years ago in New Orleans; this one was bigger (estimates put it above 350 attendees), and I was surprised to find that lots of people traveled all the way from Europe to attend. It looks as if more than 50% of the attendees were new (attending the conference for the first time).
We built an open source portal called OpenCourse a few years ago. Now I am working on a Learning Portal for XML, and I chose Plone. You can program Zope/Plone in Python, which is one of the most productive languages I have seen. I looked at Drupal (a PHP-based portal framework) and Liferay (a Java-based portal) before deciding to go the Plone route. I will share my experience building the Learning Portal in future entries in this blog.
If you want to check out Plone, these screencasts are the best starting points.
Technical Evangelism is one of my most favorite jobs. If I did not have my little startup, I would be jumping up and applying for this one. I think it is cool, because:
- You are interacting with customers
- You are telling them interesting stories (not just what your product does but also why it was built, how it was built and how it is being used)
- The questions you get give you great ideas about how your product is being perceived
- You get to do cool demos
- You get to be techie without writing lots of code
Jeff Barr is one of the most engaging evangelists I know. He talks not only about Amazon web services but also about general trends in the industry and patterns of usage of those services. I have heard him three times, and every time I noticed packed rooms and lots and lots of questions. What else can you ask for?
While the job is cool in itself, Jeff makes it seem effortless. I think there is some co-evolution of coolness between the job and Jeff.
P.S.: I changed the title of the post. I thought it read ok at first, but later somehow felt that what I really envy is Jeff doing such a great job of his job.
Alex Limi’s session on Friday at PloneConf 2006. The goal is a release in Mar 2007. I am capturing this as Alex speaks. Plone is the most powerful CMS I know. You can find more info about this conference in the PloneConf Wiki.
- Versioning (history of modifications, replaceable backend, reverting revisions, diff between revisions)
- In-place Staging (lets you work on one piece while another is live)
- Locking (Uses WebDAV semantics, stealable locks, tells you who locked it, and how long ago)
- Easier Sharing (simpler UI)
- Link Integrity (Tracks internal link dependency, warns if you delete resources used by other resources)
- Generalized Previous/Next (on documents and other resources)
- Fieldsets
- Content Rules Engine (Pluggable, UI for admins to respond to events, really cool for categorizing content etc)
- Portlets Engine (UI for managing portlets, infrastructure to write advanced portlets)
- Indexing support for Word/PDF (out of box)
- OpenID Support (decentralized login/identity system, lets you use a URL as a login, in use by sites like LiveJournal and Technorati, plugins for WordPress, MediaWiki)
- Workflow improvements (Workflow control panel, web publishing, community, intranet and internet workflows)
- All new features using Zope3 (Alex says Plone3 loves Zope3)
- Better markup support (wiki syntax support for all markup, new formats: Textile, Markdown)
- AJAX Support (in-place editing, inline validation, improved UI and widgets, makes Plone more efficient to work with)
Portlets in Plone currently are just templates; in Plone 3 they can be assigned at the Folder, User and Group levels.
Will the need to program multi-core systems increase our awareness of existing approaches to concurrent programming? Will concurrency become an extension to existing popular programming languages, bring about the adoption of new ones, or become a transparent capability in the operating system? Here is an interesting look at Erlang, a language designed with concurrency and distribution in mind.
Our free pass will expire soon, though. Brian Goetz, one of the industry’s top concurrency experts, has written about high-performance Java, including concurrency, for more than half a decade now. Brian believes that you’re likely to see a significant increase in concurrency problems, thanks to the laws of physics, subsequent demands on programmers, and shortcomings in the current model.
Moore’s Law predicts that the number of transistors on a chip doubles every two years. Semiconductor manufacturers are about to hit some physical limits that will impact their ability to double the transistors on a single wafer, so you’ll see matrices of processors. That change will force more concurrent programming, bringing more developers kicking and screaming into distributed programming.
I am no expert in concurrency, but I am keenly aware of its importance as multi-core systems become the norm. Intel is promising 80 cores on a single chip within the next 5 years and is investing a lot of money in training developers to program multi-core machines. We need languages and tools to build, test and optimize the next generation of applications.
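To make the multi-core idea concrete, here is a minimal Python sketch of my own (not from Intel’s training material) that spreads a CPU-bound task across the available cores; the prime-counting task and the numbers are just illustrative:

```python
# Spread a CPU-bound task across all available cores.
# Illustrative sketch only; the workload and numbers are made up.
from concurrent.futures import ProcessPoolExecutor

def count_primes(limit):
    """Naive prime count below limit -- deliberately CPU-bound."""
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [1_000, 2_000, 3_000]
    # Each chunk runs in its own OS process, so separate cores
    # can work on the chunks in parallel.
    with ProcessPoolExecutor() as pool:
        totals = list(pool.map(count_primes, chunks))
    print(totals)
```

The point is less the arithmetic than the shape of the program: the work must be split into independent pieces before any number of cores can help.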
Google has created a tool for building your own contextual search engine, and an additional tool to customize it incrementally. You can create your own here.
You can customize it in two ways:
- By specifying a set of keywords
- By specifying a set of websites (the search can be limited to these sites or just used for emphasis)
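Conceptually, limiting the search to a set of sites is just a domain filter over results. Here is a hypothetical sketch of that idea (my own illustration, not Google’s actual API or algorithm; the site list is made up):

```python
# Hypothetical illustration of "limit search to these sites":
# keep only result URLs whose host falls under an allowed domain.
from urllib.parse import urlparse

ALLOWED_SITES = ["w3.org", "xml.com"]  # made-up example list

def restrict_to_sites(results, allowed=ALLOWED_SITES):
    kept = []
    for url in results:
        host = urlparse(url).netloc
        # match the domain itself or any subdomain of it
        if any(host == d or host.endswith("." + d) for d in allowed):
            kept.append(url)
    return kept

print(restrict_to_sites([
    "http://www.w3.org/TR/xml/",
    "http://example.com/xml-tutorial",
    "http://www.xml.com/pub/a/2006/",
]))
```

The "emphasis" mode would presumably rank such matches higher rather than discarding everything else.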
There are a couple of nice features.
- Google Marker allows you to add sites to your custom search engine easily. This way you can keep refining your search engine
- The keywords are used as hints to the search engine. I hope Google will use them simply as a way to establish context for your searches.
- You can allow others to contribute to this search.
I created one for XML to test it. It works great. I will keep refining it.
Here are a couple of wish list items:
- A recommendation service where new related topics and sites are recommended from Google’s database of searches
- An RSS stream for results
- More ways to customize search results
Here is the link to Google’s Blog Entry.
A fascinating entry from Think Blog.
Every era has a company that defines it. We are now in the Knowledge Economy.
A global multi-cultural, fast moving, unwired world with very different needs and challenges. An interesting time to be in the middle of it all.
I found this nice discussion about functional languages in Linspire (thanks to Silicon Valley Patterns Group).
- Erlang: Distributed systems, massive concurrency, anything with lots of network IO (webservers, load balancers), particularly binary network protocols
- ML: Fast. Easy to express complex data structures. Good for theorem provers, compilers, other algebraic manipulation software.
- Haskell: Very strong compiler construction tools. Makes a pretty good scripting language (there’s a thread about a Haskell shell on the mailing list right now). High reliability – I’ve found that 90% of the time my Haskell code “just works” the first time I compile it. Interesting research being done for webapps (WASH) and database integration (HaskellDB).
- Scheme: Fun to make interpreters in. Very good for domain-specific languages. Good language-design research vehicle.
I looked at Erlang a couple of years ago when I read that the language is used by Ericsson for all their Telecom software. It is nice to see it pop up again in discussions.
With the impending ubiquity of multi-core systems, massive concurrency may really be a great asset. It definitely needs further investigation. A bonus, in the comments, is a summary placing languages on a continuum of conceptual size.
I’m speaking mostly in terms of the number of concepts (both syntactic and semantic) required to learn the language.
By comparison, here’s how I’d rate other mainstream languages on the size continuum:
- Very large: Common Lisp, C++, Perl
- Large: Haskell, Ocaml, PHP, C#
- Medium: Java, Python, Ruby, ML, Dylan
- Small: Erlang, Scheme, Smalltalk, C
I wonder whether this classification includes the libraries you need to know to do anything useful.
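Since Erlang keeps coming up, here is a rough sketch of its core idea in Python: isolated processes that share nothing and communicate only by sending messages to each other’s mailboxes. This is my own approximation using threads and queues, not real Erlang semantics:

```python
# Erlang-style message passing, approximated with threads and queues.
# In Erlang, processes share no memory; here each worker owns a
# mailbox (Queue) and reacts only to the messages it receives.
import threading
import queue

def worker(mailbox, replies):
    while True:
        msg = mailbox.get()
        if msg == "stop":        # poison pill ends the "process"
            break
        # "receive" a request, reply with a transformed message
        replies.put(msg.upper())

mailbox, replies = queue.Queue(), queue.Queue()
t = threading.Thread(target=worker, args=(mailbox, replies))
t.start()

for word in ["hello", "concurrent", "world"]:
    mailbox.put(word)            # asynchronous send, like Pid ! Msg
mailbox.put("stop")
t.join()

results = [replies.get() for _ in range(3)]
print(results)  # ['HELLO', 'CONCURRENT', 'WORLD']
```

Because nothing is shared, there are no locks in sight; scaling to thousands of such processes cheaply is exactly what Erlang was built for.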
Here is an innovative way of teaching Computer Science. From Mark Guzdial’s blog:
Our goal is to teach computer science — using robotics and cognitive science as inspiration, but also drawing on computational science and other domains. We’re using robots as a strategy for learning and teaching — a place to draw interesting examples and a way to make the computing concrete and tangible.
Links: Robot Education
Effective programmers instinctively know this. They get bored doing the same thing over and over again, and so end up building their own tools to automate the boring parts of their work. From Phil’s Technometria:
As programmers, we ought to be tool builders. Anytime you find yourself doing something more than once, build a tool. Doing so pays big dividends in increasing personal productivity.
Doug Engelbart often talks about the co-evolution of tools and human capabilities. From a thread on blueoxen discussion.
- Co-evolution is the capability of evolving both human and tool capabilities
- Humans make tools
- Tools augment human capabilities
- Augmented humans make better tools and so on
It is the cycle of innovation that is fostered by thinking, inventing, using, improving and thinking about improving.
Ever since I read Mindstorms about 4 years ago, I have been fascinated by mathetic thinking, proposed by Seymour Papert. Papert is a living legend – the inventor of Logo and someone who predicted, back in the early sixties, that students would be using computers.
Here is Judy O’Connell’s blog on Global Summit 2006 (what a cool blog id) on Seymour Papert’s points about learning.
- Every kid must have a computer! It is ridiculous to waste further time debating this. Every knowledge worker (with the exception of our students) finds that technology is the proper medium for thinking work. If knowledge workers have computers, then why not kids?
- Shift from HOW to WHAT to learn.
- Recognise that it is global forces that drive change in education. Look to the forces in the global scene, rather than relentless educational debate to find the focus for future learning initiatives.
- Stop talking to the computer industry, and do not accept their economic agenda to spend more in order to buy bigger and better. We should be setting the pace and saying what we need. The $100 laptop project shows the clout that we can have if we wish to really make a difference.
Look at the forces in the global scene. I think every country needs to do that. What a noble goal, to give every child in the world a learning tool – the computer.