Posted on 07-06-2013
I'm working with a variety of tools and services to keep up with my daily research, curation, analysis and ultimately publishing of stories from the world of APIs.
Over the last six months I've migrated to an approach I've called Hacker Storytelling. My goal is to efficiently discover, organize and publish as many meaningful stories around best practices in the business of APIs as I can, while encouraging the widest possible distribution.
Hacker Storytelling currently centers around publishing micro project sites as Github repositories using a simple, blog-aware, static site generator called Jekyll. You can do a lot of storytelling with static markup or Markdown via pages and chronological blog posts.
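For reference, a Jekyll blog post is just a Markdown file with a small block of YAML front matter at the top; the layout name and title below are made up for illustration:

```markdown
---
layout: post
title: "Hacker Storytelling"
---

Everything below the front matter is plain Markdown, which Jekyll
renders into a static page or a chronological blog post.
```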
Even with pages and blog posts, I need more fuel for my stories. I use Mustache + JSON to display everything from simple bulleted or numbered lists to company and tool listings. Mustache allows me to maintain a central data store, which I reuse as efficiently as I can across all the stories I tell.
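Mustache itself typically runs in the browser or at build time, but the idea is simple enough to sketch. Here is a minimal Python stand-in that expands Mustache-style sections and variables from a JSON data store; the data file contents and key names are hypothetical:

```python
import json
import re


def render(template, data):
    """Expand {{#key}}...{{/key}} sections over lists, then {{key}} variables.

    A toy subset of Mustache, just enough to show the template + JSON idea.
    """
    def expand_section(match):
        key, body = match.group(1), match.group(2)
        return "".join(render(body, item) for item in data.get(key, []))

    out = re.sub(r"{{#(\w+)}}(.*?){{/\1}}", expand_section, template, flags=re.S)
    return re.sub(r"{{(\w+)}}", lambda m: str(data.get(m.group(1), "")), out)


# Hypothetical central data store: a JSON listing of tools.
tools_json = '{"tools": [{"name": "Jekyll"}, {"name": "Mustache"}]}'
template = "<ul>{{#tools}}<li>{{name}}</li>{{/tools}}</ul>"

html = render(template, json.loads(tools_json))
print(html)  # <ul><li>Jekyll</li><li>Mustache</li></ul>
```

The payoff is that the same JSON file can drive a bulleted list on one page and a full tool listing on another, with only the template changing.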
To help me acquire this data, I depend on multiple APIs, but when it comes down to it, a lot of my data is harvested or scraped. To manage my harvesting I use ScraperWiki to acquire, clean up and deliver structured data in JSON format. I maintain a vast archive of data as JSON files across multiple Github repositories, where I use JSON Editor Online to make quick and dirty edits, adding the essential human element to my curation algorithm.
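The exact ScraperWiki scripts aren't shown here, but the cleanup step amounts to normalizing scraped records into consistent JSON before they land in the archive. A minimal sketch, with entirely hypothetical field names and values:

```python
import json


def clean_record(raw):
    """Normalize one scraped record: trim whitespace, drop empty fields,
    and standardize key names before it lands in the JSON archive."""
    cleaned = {}
    for key, value in raw.items():
        key = key.strip().lower().replace(" ", "_")
        if isinstance(value, str):
            value = value.strip()
        if value not in ("", None):
            cleaned[key] = value
    return cleaned


# Hypothetical scraped rows with messy keys, padding, and empty fields.
scraped = [{" Company Name ": "  3scale ", "API URL": "", "Notes": None}]
archive = [clean_record(r) for r in scraped]

# Pretty-printed JSON keeps the archive files readable when hand-editing
# them later in a tool like JSON Editor Online.
print(json.dumps(archive, indent=2))
```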
In addition to my projects, I do a lot of speaking. I have a standard approach to publishing content from my central content and data stores as presentations. Each conference keynote or session I do, as well as presentations for meetups, hackathons or even internal audiences at various companies, is centered around a presentation I custom build at the moment of delivery. I use either deck.js or reveal.js as my presentation delivery tool, as opposed to classic PowerPoint or the newer Google Presentation.
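Because a reveal.js deck is just HTML, with one `<section>` element per slide, a deck can be generated straight from the same content store as everything else. A rough Python sketch of that idea, with made-up slide content (the surrounding page would still need the reveal.js CSS and JavaScript includes):

```python
def build_deck(slides):
    """Wrap each slide's HTML fragment in a <section>, inside the
    standard reveal.js .reveal/.slides container markup."""
    sections = "\n".join(f"    <section>{s}</section>" for s in slides)
    return (
        '<div class="reveal">\n'
        '  <div class="slides">\n'
        f"{sections}\n"
        "  </div>\n"
        "</div>"
    )


# Hypothetical slide content pulled from the central content store.
deck = build_deck(["<h1>The Business of APIs</h1>", "<p>Why APIs matter</p>"])
print(deck)
```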
Once I create static pages, blog posts and presentations using content and data I've curated and written, I need a place to put it. I usually start with a Github repository, using Git as the central project management platform. After that, if I want a project to have a public life, I will publish to the web using Github Pages, Amazon S3 or Dropbox, depending on my goals around the project.
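Publishing to Github Pages comes down to pushing the generated site to a gh-pages branch. As a hedged sketch, here the git commands are built as argument lists rather than executed, so the sequence stays visible and testable; the remote name is an assumption:

```python
def gh_pages_publish_cmds(remote="origin", branch="gh-pages"):
    """Return the git commands (as argument lists) that publish the
    current working tree's built site to a GitHub Pages branch."""
    return [
        ["git", "checkout", branch],
        ["git", "add", "-A"],
        ["git", "commit", "-m", "Publish site"],
        ["git", "push", remote, branch],
    ]


for cmd in gh_pages_publish_cmds():
    print(" ".join(cmd))
```

Publishing to Amazon S3 or Dropbox follows the same pattern: build the static files once, then sync them to whichever host fits the project.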
You can find a list of services and tools I'm currently using on the Hacker Storytelling Toolbox page. I will keep it up to date as I find new tools and services. If there is anything you think I should consider, that contributes to your own storytelling process, please let me know.