Get up to speed on K-Logging

July 2nd, 2002

Every company can benefit from K-Logging.

Projects collect and compile hard data and soft knowledge into reports. The details of lessons learned along the way are often distilled out during the reporting process, so subsequent projects tend to benefit from the accrued soft knowledge only when largely the same team is reassembled.

Knowledge Logging (K-Logging) is a bottom-up approach to sharing the soft knowledge that is built up during the course of a project. Knowledge is both logged and shared continuously at the source, using content management tools that allow easy updating of a shared website. The style is informal, freeform and conversational, and the focus is on collaboration and discussion.

The thinking-out-loud style of keeping a K-Log journal of project activities keeps every part of the process available during and after the project. This allows detailed review and lets latecomers to the project get up to speed. The dead-end attempts that provide the best learning opportunities are documented and kept for others to learn from.

Since knowledge is continuously shared, teams that are spread out get to share best practices without waiting for summary or review. The knowledge website can also act as a portal to centralize access to other essential project information – scheduling, maps, data storage, email, etc.

There are many tools available for this kind of distributed knowledge management. While prices can range from free to tens of thousands of dollars, cost does not necessarily imply value. There are many open-source content management tools which suit this purpose well.

A good, flexible system can be built with Linux, Apache, MySQL, PHP and any of a number of content management systems – all open-source software with no licensing costs. Dedicated hardware needs are modest – a used name-brand Tier-1 Pentium II 300+MHz machine can be had for less than CDN$200, and a second such machine could provide standby and backup. A high-speed internet connection would be necessary, although bandwidth needs would be low. SSL (https) can provide end-to-end encryption for secure access.

Some upfront analysis would be necessary to determine layout, functionality and permissions. Training needs are small – users access the site via a browser and are presented with straightforward editing screens. Ongoing care-and-feeding needs would depend on the selected feature set.

A simple collaboration tool could be built in a day with no capital outlay using space on an inexpensive hosting provider for US$20 or less per month.

A custom knowledge portal could be built with a CDN$500 hardware/software budget and consulting hours to suit specific needs.

Here is a short, by no means exhaustive, list of tools and resources for K-Logging (and all sorts of content management):

Movable Type
Radio UserLand

K-Logs group on Yahoo
David Gurteen is a good resource for all things related to Knowledge Management. KM is a much bigger topic than K-Logging.
Steven Vore has a good KM-based blog.
John Robb is on top of the whole K-Logs thing at UserLand.
Pat Delaney uses K-Logging to coordinate educators.

Finally, Phil Wolff has a great article about how K-Logging can solve many of the problems that companies face with knowledge management.


Look Ma, no wires!

June 25th, 2002

I finally went out and got myself an 802.11b card and access point. I got the SMC2632W card and installed it under Windows with the drivers on the disk.

I was dreading the uphill battle of getting it running with Linux, but I fired up Mandrake, pulled the ethernet card out and slapped in the wireless card and PRESTO! Worked right away, no config, no searching for drivers, no nothing.



Hosting Woes

June 22nd, 2002

[ NOTE: 28 Jan 2003 – I’ve changed my tune about PHPWebhosting. They’ve really turned the corner and provide value for money. This particular rant is a slice in time when things weren’t going so well with them.]

I’m not particularly happy with my hosting provider this week.

Blogchat.com was completely down for 12 hours or so last week. Then it was brought back to a readable state where people could use it again. However, 72 hours later, we still haven't regained the ability to access the files in any meaningful way to change them: updating the blog, shelling in for maintenance, editing files, you name it.

PHPWebhosting.com provides support only through their control-panel interface. There is no email address, no phone number. We accept this because it is a remarkably inexpensive yet full-featured service.

However, despite repeated service requests with specific instructions to give us some sort of acknowledgement that they are receiving our requests and doing something about them, our problem remains unsolved and they have made no attempt to contact us.

We have transferred our beta operations to CubeSoft and I’ve temporarily moved my blog to my ashleyit.com site. Luckily I use ZoneEdit for my DNS needs and redirection was simple so archives are not broken. Had we had our DNS with them, we’d be outta luck for another 48 hours while we changed Internic info.

Tim and I have 3 accounts altogether on PHPWebhosting which we may now have to move elsewhere. Lost revenue for them, hassles for us.

All they had to do was acknowledge our existence and tell us they were working on it. I don’t expect miracles from an inexpensive hosting provider – I understand their razor-thin margins – but I do expect common courtesy.



June 19th, 2002

I’ve been having a look at AmphetaDesk today.

In conjunction with Les Orchard’s collapsible-channels-and-items skin, I like it quite a bit.

I originally loaded it on Win2000 and browsed to it on the same machine as intended, but decided I wanted to set it up more like my usual configuration.

My current modus operandi for news aggregation is to have Radio always running on my home machine and then I connect to it from wherever I happen to be to get the news page. I can get it from the kitchen, from the office, from someone else’s desk, anywhere.

So, I loaded AmphetaDesk up on my Linux box and ran it under nohup (that is, as a persistent background service).
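Running it under nohup amounts to something like this. (The actual AmphetaDesk startup command depends on your install; the stand-in command below is just for illustration.)

```shell
# Start a long-running process detached from the terminal so it survives
# logout; stdout/stderr go to a log file instead of the tty.
# "sleep 60" stands in for AmphetaDesk's real startup command, run from
# its install directory.
nohup sleep 60 > /tmp/amphetadesk.log 2>&1 &
echo "running as PID $!"
kill "$!"    # demo cleanup only; a real service would be left running
```

The process keeps running after the shell that launched it exits, which is what lets the aggregator stay up on the home machine between visits.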

I imported the same 39 RSS feeds I use in Radio simply by taking the gems/mySubscriptions.opml file from the Radio directory, overwriting data/myChannels.opml in the AmphetaDesk directory, and restarting AmphetaDesk.
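The swap boils down to one file copy. Here is a sketch using throwaway directories in place of the real Radio and AmphetaDesk install paths (which vary by machine):

```shell
# Stand-in directories for the two installs; on a real machine these would
# be Radio's and AmphetaDesk's actual install locations.
RADIO=$(mktemp -d); AD=$(mktemp -d)
mkdir -p "$RADIO/gems" "$AD/data"
echo '<opml version="1.1"/>' > "$RADIO/gems/mySubscriptions.opml"
echo '<opml version="1.1"/>' > "$AD/data/myChannels.opml"

# The actual swap: back up AmphetaDesk's channel list, then overwrite it
# with Radio's subscription list. Restart AmphetaDesk afterwards.
cp "$AD/data/myChannels.opml" "$AD/data/myChannels.opml.bak"
cp "$RADIO/gems/mySubscriptions.opml" "$AD/data/myChannels.opml"
cmp -s "$RADIO/gems/mySubscriptions.opml" "$AD/data/myChannels.opml" && echo "channels imported"
```

This works because both programs happen to store subscriptions as OPML, so no format conversion is needed.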

Everything loaded up nicely – 39 feeds, each with multiple items, all nicely collapsed or expanded at my behest.

Worked fine from my Win box at home too – took a while to load the page, but not too bad.

I told Tim via IM to come take a look, and it took him AGES to load. I looked, and of course, since every RSS feed was being not only loaded but also rendered to verbose HTML, the page was over 1.1 Megabytes.

Even after dropping down to only 6 feeds, the page still loaded up at 100k. You can see this won’t scale well – if I were to load up Jenny’s feeds, I’d be clocking 2 Meg of download per page view. Ouch.

Another issue for me is that feeds are listed by when they were last downloaded. From what I can see, though, every feed is downloaded on every scan, so each one always appears to have been updated at the last scan. I’d rather have the fetched RSS content compared with the existing content and the list sorted by last changed rather than last scanned – that would make fresh data percolate to the top, which it currently doesn’t.
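A sketch of the compare-on-download idea (this is not how AmphetaDesk actually works, and the file names are made up for illustration): diff each fetched feed against the previous copy, and only bump a "last changed" marker when the content really differs.

```shell
# Compare a freshly fetched feed with the previous copy; only when they
# differ do we replace the old copy and update the "last changed" marker
# that the channel list would be sorted by.
FEEDS=$(mktemp -d)
echo '<rss>same items</rss>' > "$FEEDS/feed.xml"       # previous scan
echo '<rss>same items</rss>' > "$FEEDS/feed.xml.new"   # this scan
if cmp -s "$FEEDS/feed.xml.new" "$FEEDS/feed.xml"; then
    rm "$FEEDS/feed.xml.new"
    echo "no change - keep old last-changed time"
else
    mv "$FEEDS/feed.xml.new" "$FEEDS/feed.xml"
    touch "$FEEDS/feed.xml.changed"    # sort channels by this file's mtime
    echo "feed changed - percolate to top"
fi
```

With identical content, as in this demo, it takes the "no change" branch, so the feed would keep its old place in the list instead of always jumping to the last scan time.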

Then one could highlight the recently changed channels, populate them with content, and only load the content of unchanged channels on drilldown – either on another page or by dynamically filling a div via XML-RPC or remote scripting. This would keep the page small until the extra content is actually needed.

Overall, it’s a great hunk o’ code – really nicely written and documented. In an interview, AmphetaDesk’s creator says:

It’s not 1.0 yet because I don’t think it’s great enough for a 1.0 release

I have to agree that there are some yet-to-be-added features (e.g. last-changed time, summary-with-detail-on-demand vs. a giant monolithic page) that I wouldn’t want missing from a 1.0 release, but the quality is great, so I expect it to be solid once it’s feature-filled.


Spin Correction

June 18th, 2002

Scott mustn’t use his own IMsaver tool.

Today he quotes me with an opinion on www.webskylines.com:

My buddy Craig Bosko, the guy behind www.webskylines.com, has his new site up. His previous site was very, very flash centric, so much so that Brent, when he saw it, commented to me that he didn’t think Craig could do HTML at all. That is just SO not true that I passed it on.

He attributes that quote to me from this conversation from May 1st (verbatim including typos):

thatbrentguy (09:15:12 PM): I like the look of IMsaver – very slick. that’s
something I’m not good at. at least yet
fuzzygroup (09:15:24 PM): www.webskylines.com
fuzzygroup (09:15:49 PM): Good guy. Great designer. Good friend. Tell him I sent you if you ever need anything. He’ll give you a break.
thatbrentguy (09:16:22 PM): good to know he can do good html design. the flash would have sent me running the other way.
fuzzygroup (09:16:54 PM): Yup. I’m going to copy that comment to him so he knows about that.
thatbrentguy (09:17:15 PM): heh heh. I’ll sltand behind it. the latest
macromedia push gives me the shivers.

It reads a little differently in context. I don’t mind at all being quoted, but when I’m quoted as saying disparaging things about others in their absence, it helps to be precise about what was said.


Healing Karma Broadcast for Dave W

June 16th, 2002

Everybody focus your healing Karma towards Silicon Valley. John Robb reports that Dave Winer is in hospital till next weekend. No more detail than that, but he says Dave will supply detail himself when he is able.


Wanker Management

June 13th, 2002

Dorothea really speaks my language about Wanker Management.


Reader to the rescue

June 13th, 2002

Dorothea has a very complete and useful explanation of the HTML-entities-in-RSS rendering problem.

I’ll remember that for when I do the RSS translator, which will be on hold until I find a freely available machine-translation service that won’t mind my scripts bombarding it.