{"API Support"}

Google Support Buttons

I talked about the gap between developer relations and support at Google, something that Sam Ramji (@sramji) has acknowledged is being worked on. Support for a single API can be a lot of work, and it gets exponentially harder with each API and developer you add to your operations, and after looking through 75 of the Google APIs this weekend, I see evidence that Google is working on it.

While many Google APIs still have sub-standard support, when you look at Google Sheets you start seeing evidence of their evolved approach to support, with a consistent set of buttons that tackle many of the common areas of API support. For general questions, Google provides two buttons linked to Stack Overflow:

The search button just drops you into Stack Overflow with the tag "google sheets api", and the ask a new question button drops you into the Stack Overflow new question form. For bug reporting, they provide a similar set of buttons:

The search and report bug buttons drop you into the Google Code issues page for Google Sheets, leveraging the issues management for the Google Code repository--something that can just as easily be done with Github issues. Then lastly, they provide a third set of buttons for when you are looking to submit a feature request:

Even though there is a typo on the first button, they also leverage Google Code issue management to handle all feature requests. Google is obviously working to centralize bug and feature reporting, and support management, using Google Code--something I do across all my API projects using Github organizations, repositories, and their issue management. I'm guessing Google support is tapping into Google Code to tackle support across projects at scale.
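As a rough sketch of what those general-question buttons boil down to, both links can be generated from a single tag--I'm assuming the hyphenated google-sheets-api tag, and Stack Overflow's tagged and ask URL patterns, here:

```python
from urllib.parse import quote_plus

def support_links(tag):
    """Build Stack Overflow search and ask links for an API's support tag."""
    return {
        # Browse existing questions carrying the tag.
        "search": f"https://stackoverflow.com/questions/tagged/{quote_plus(tag)}",
        # Open the new-question form with the tag pre-filled.
        "ask": f"https://stackoverflow.com/questions/ask?tags={quote_plus(tag)}",
    }

links = support_links("google-sheets-api")
```

A pair of links like this is cheap to generate for every API in a catalog, which is likely part of how Google keeps the buttons consistent across 75+ APIs.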

These support buttons may seem trivial, but they represent a more uniform approach by the API giant to how they handle support across their API offerings--something that can go a long way in my experience. It gives your API consumers a familiar and intuitive way to ask questions, submit bugs, and suggest new features. Equally as important, I'm hoping it also gives Google a consistent way to tackle support for their APIs in a meaningful way, one that meets the needs of their API consumers.

See The Full Blog Post

The Relationship Between Dev Relations And Support

I saw an interesting chasm emerge while at a Google Community Summit this last week, where I heard their support team talk, as well as their developer relations team discuss what they were up to. During the discussion, one of the companies present shared that while their overall experience with the developer relations team has been amazing, their experience with support has largely been pretty bad--revealing a potential gap between the two teams.

This is a pretty common gap I've seen with many other API platforms. The developer relations team is all about getting the word out and encouraging platform usage, while support teams are there to be the front line, acting as the buffer between integration and platform engineering teams. I've been the evangelist when there is a bug in an API, at the mercy of an already overloaded engineering team and QA staff before anything gets resolved--this is a difficult position to be in.

How wide this chasm becomes ultimately depends on how much of a priority the API is for an engineering team, and how overloaded they are. I've worked on projects where this chasm is pretty wide, with bugs taking days, even weeks to get fixed. I'm guessing this is something a more DevOps focused approach to the API life cycle might help with, where API developer relations and support teams have more access to making changes and fixing bugs--something that has to be pretty difficult to deal with at Google scale.

Anyways, I thought the potential chasm between developer relations and support was worthy enough to discuss and include in my research. It is something we all should be considering no matter how big or small our operations are. There is no quicker way to kill the morale of your API developer relations and support teams than allowing a canyon like this to persist. What challenges have you experienced when it comes to getting support from your API provider? Or inversely, what challenges have you faced supporting your APIs or executing on your developer outreach strategy? I'm curious if other folks are feeling this same pain.

See The Full Blog Post

All The Right Channel Icons In Support Of Your API Platform

I look at a lot of websites for companies who are providing APIs and selling services to the API space. When I find a new company, I can spend upwards of 10 minutes looking for all the relevant information I need to connect. Elements like where their Twitter and Github accounts are. These are the key channels I look for so that I can better understand what a company does and stay in tune with any activity, and they are also the same channels that developers will be looking for so that they can stay in tune with a platform as well.

I spend a great deal of time looking for these channels, so I'm always happy when I find companies who provide a near complete set of icons for all the channels that matter. Restlet, the API design, deployment, management, and testing platform, has a nice example of this in action, providing the following channels:

  • Facebook
  • Twitter
  • Google+
  • LinkedIn
  • Vimeo
  • Slideshare
  • Github
  • Stack Overflow
  • Email

All of these channels are available as orderly icons in the footer of their site, making my job easier, and I'm sure making it easier for other would-be API developers. They also provide an email newsletter signup along with the set of icons. While this gives me a nice set of channels to tune into, more than I usually find, I would still like to see blog and Atom feed icons, as well as maybe AngelList or Crunchbase, so that I can peek behind the business curtain a little.

I know. I know. I am demanding, and never happy. I am just trying to provide an easy checklist of the common channels that companies looking to do interesting things with APIs should consider offering. You should only offer up channels that you can keep active, but I recommend that you think about offering up as many of these as you can possibly manage. No matter which ones you choose, make sure you organize them all together in the header and footer of your website, so nobody has to go looking for them.

See The Full Blog Post

Every Government Agency Should Have An FAQ API Like The DOL

I wrote about my feelings that all government agencies should have a forms API like the Department of Labor (DOL), and I wanted to separately showcase their FAQ API, and say the same thing--ALL government agencies should have a frequently asked question (FAQ) API. Think about the websites and mobile applications that would benefit from ALL government agencies at the federal, state, and local level having frequently asked questions available in this way--it would be huge.

In a perfect world, like any good API provider, government agencies would also use their FAQ API to run their website(s), mobile apps, and internal systems--this way the results are always fresh, up to date, and answering the relevant questions (hopefully). I understand folks in government questioning the opening up of sensitive information via APIs, but making FAQs available in a machine readable way, via the web, just makes sense in a digital world.

Like the forms API, I will be looking across other government agencies for any FAQ APIs. I will be crafting an OpenAPI Spec for the DOL FAQ API (man that is a lot of acronyms). I will take any other FAQ APIs that I find and consider any additional parameters and definitions I might want to include in a common FAQ API definition for government agencies. This is another area that should have not just a common open API definition and underlying schemas, but also a wealth of server and client side code--so any government agency can immediately put it to work in any environment.
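To give a sense of what a common definition could look like, here is a minimal sketch of an FAQ API built as a Python dict and dumped as OpenAPI JSON--the path and parameters are my own guesses for illustration, not the actual DOL design:

```python
import json

# A minimal common FAQ API definition -- hypothetical paths and parameters.
faq_api = {
    "openapi": "3.0.0",
    "info": {"title": "Agency FAQ API", "version": "1.0.0"},
    "paths": {
        "/faqs": {
            "get": {
                "summary": "List frequently asked questions",
                "parameters": [
                    # Filter FAQs down by agency topic or free-text keyword.
                    {"name": "topic", "in": "query", "schema": {"type": "string"}},
                    {"name": "keyword", "in": "query", "schema": {"type": "string"}},
                ],
                "responses": {"200": {"description": "A list of FAQs"}},
            }
        }
    },
}

spec = json.dumps(faq_api, indent=2)
```

A shared definition like this is what would let server and client code be written once and reused by any agency.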

See The Full Blog Post

The Month At A Glance, Road Map, Support, And The Recent Posts For Your API Platform

I was playing with Microsoft's API Catalog, a tool to visualize and analyze the API overlap between standards specifications and type systems within browsers, and their footer caught my eye. I am always looking for quality examples of companies doing platform communications and support well, and I think the layout and available building blocks in their footer are worthy of showcasing.

For me, these numbers, and the available communication and support building blocks, send the right signals--that a platform is alive and active. These are the data points I tune into to understand how well a platform is doing, or not doing. There are no absolutes when it comes to this type of monitoring, as anything can be gamed, but signs like Github activity, road map evolution, and blog storytelling can provide a vital heartbeat for any healthy API platform.

See The Full Blog Post

Managing The ClinicalTrials.gov Support, Feedback, And Roadmap Using Github

Each Adopta.Agency project runs as a Github repository, with the main project site running on Github Pages. There are many reasons I do this, but one of the primary ones is that it provides me with most of what I need to provide support for the project.

Jekyll running on Github Pages gives me the ability to have a blog and manage the pages for the project, which is central to support. Next, I use Github Issues for everything else. If anyone needs to ask a question, make a contribution, or report a bug, they can do so using Github Issues for the repository.

I even drive the project road map and change log using Github Issues. If I tag something as roadmap, it shows up on the road map, and if I tag something as changelog and it is a closed issue, it will show up on the change log--this is the feedback loop that will help me move the clinical trials API forward.
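A minimal sketch of that feedback loop, using the issue shape the Github API returns (labels as objects with a name, plus a state field):

```python
def roadmap_and_changelog(issues):
    """Split Github issues into road map and change log entries by label.

    Issues tagged 'roadmap' are planned work; issues tagged 'changelog'
    only count once they are closed.
    """
    roadmap, changelog = [], []
    for issue in issues:
        labels = {label["name"] for label in issue["labels"]}
        if "roadmap" in labels:
            roadmap.append(issue["title"])
        if "changelog" in labels and issue["state"] == "closed":
            changelog.append(issue["title"])
    return roadmap, changelog

# Example issues in the shape returned by the Github Issues API.
issues = [
    {"title": "Add search endpoint", "state": "open",
     "labels": [{"name": "roadmap"}]},
    {"title": "Fix paging bug", "state": "closed",
     "labels": [{"name": "changelog"}]},
]
```

In practice the issues list would come from the repository's Github Issues API endpoint, and the two lists would be rendered onto the road map and change log pages of the Jekyll site.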

See The Full Blog Post

Breaking Out API Support Into A Separate Research Area

Supporting your community is not unique to the API space, but supporting API operations does have some unique needs, and approaches that are proven by leading platforms. Like other areas of my research, I'm pulling out API support into its own area, so I can start shining a light on successful patterns I find in the area of API support.

Two things pushed me to spin out this research area. One, I was tagging more blog posts and other resources as support, and without a dedicated research area, this information would rarely float to the surface for me. Two, my partners over at Cloud Elements have an API hub dedicated to "Help Desk". While their aggregate API solution targets more than the API community, it is API driven, and can also be applied to providing an aggregate support solution for API communities.

As with most areas of the API space, there are several dimensions to how APIs are being applied to support customers and online communities. With my research, I will focus on tracking approaches to community support for API providers and API service providers. There will also be a layer of tracking on help desk and support platforms that employ APIs, as well as API aggregation and API interoperability solutions from leaders (and my partners) in the space like Cloud Elements.

You can visit my API support research via its Github repository, and I will make sure to continue linking to it from my API management research, which it was born out of.

See The Full Blog Post

Update 265 Pages, and 175 Links On My Network To Support Swagger to OADF Shift

I have written 265 separate posts about Swagger across the API Evangelist network in the last three years. To reflect the recent shift of Swagger into the Open API Initiative (OAI), and the specification being reborn as the Open API Definition Format (OADF), I wanted to update all the stories I've told over the years, to help educate my readers about the evolution, and provide the most relevant links possible.

Using my Linkrot API, which helps me manage the links across my network of sites, I've identified all the pages with Swagger relevant links, and made sure they are updated to point at the most recent material. I've also added a banner to each of the 265 posts, helping educate readers who come across these posts, regarding the transition from Swagger to OADF, and help them understand where to find the latest information.
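The update pass itself is simple enough to sketch--the mapping here is a placeholder, as the real old-link to new-link pairs came out of my Linkrot API:

```python
# Placeholder mapping of outdated Swagger links to their current targets --
# the real pairs are maintained in my Linkrot API, not hard-coded like this.
LINK_MAP = {
    "http://old-swagger-link.example.com": "http://new-oadf-link.example.com",
}

def update_links(page_html):
    """Swap any outdated Swagger links in a page for their current targets."""
    for old, new in LINK_MAP.items():
        page_html = page_html.replace(old, new)
    return page_html

updated = update_links('<a href="http://old-swagger-link.example.com">Swagger</a>')
```

Running a pass like this across every page of a network of sites is how 175 links get pointed at the most recent material without touching each post by hand.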

My network of sites is meant to be my workbench for the API space, and provide the latest information possible about what drives the API sector. It is important to me that the information is as accurate as possible, and that my readers stay in tune with the shifts of the API space, and where they can find what they need to be successful.

In the end though, all of this is really just business. 

See The Full Blog Post

Using Existing Online Forums vs Developing Your Own To Support API Operations

As I tune into the fallout around the Reddit community, I think it is a good time to pull a story out of my notebook that I began writing a month or two ago. These thoughts were born out of my post Ask The Stack When You Need API Support, where my friend Jeremiah Lee (@JeremiahLee) commented, disagreeing with my recommendation that API providers use Stack Exchange as part of operations.

Jeremiah shares his story about how hostile Stack Overflow can be (I'll let you read his comment on your own), something that has been reinforced many times along the way. While I still endorse Stack Overflow as a valuable building block for some APIs, I completely agree with Jeremiah about the overall community tone. While Stack Overflow is definitely a different beast, it suffers from some of the same systemic illnesses that Reddit does. I find some very valuable information on Stack Overflow, which I use regularly, but if you are looking to develop an API community, I can envision the community potentially working against you in several ways.

It really depends on your API, and the target audience you are trying to reach. Sometimes the male dominated, "gamified community of meritocracy, where established members have many rules and politics", as Jeremiah says, is exactly who you should be engaging, but you really need to get to know the personas of your ideal customers and decide on your own. For API Evangelist, both my storytelling and the APIs I provide, I've long felt Stack Overflow is not my target audience.

Similar to Reddit, Hacker News, and DZone, I get very little value from these communities. While I do find nuggets of information at these places, I do not post my stories or engage in conversations there, because I have found that the hostility that can come from these channels outweighs any value they might bring in traffic. I seek a much wider audience than the individuals who often dominate these tech hangouts.

I encourage you to think deeply about who you want to reach with your APIs. While building your own community in your own wiki, forum, or other solution might take a lot of time and work, it gives you greater control over the tone your forum takes--which might make the difference in attracting the right developer and business audience you seek.

See The Full Blog Post

@Broadcom, I Am Going To Need Your Switches To Support Virtualized Containers So I Can Deploy My Own APIs Too

While processing the news today over at API.Report, I came across a story about Broadcom delivering an API for managing their latest network infrastructure. The intersection of Software Defined Networking (SDN) and Application Programming Interface (API) is something I’m paying closer attention to lately. Hmmm. SDN + API = Brand New Bullshit Acronym? Meh. Onward, I just can’t slow down to care--{"packet keep moving"}.

At the networking level, I’m hearing Broadcom making a classic API infrastructure argument, "With the OpenNSL software platform, Broadcom is publishing APIs that map Broadcom's Software Development Kit (SDK) to an open north bound interface, enabling the integration of new applications and the ability to optimize switch hardware platforms.”, with examples of what you could build including "network monitoring, load balancing, service chaining, workload optimization and traffic engineering."

This new API driven approach to networking is available in the Broadcom Tomahawk and Trident II switches, looking to build up a developer community who can help deliver networking solutions, with Broadcom interested in giving, "users the freedom to control their technology, share their designs and boost application innovation.” Everything Broadcom is up to is in alignment with other valid Internet of Things (IoT) efforts I’m seeing across not just the networking arena, but almost any other physical object being connected to the Internet in 2015.

I think what Broadcom is doing is a very forward leaning effort, and providing native API support at the device level is definitely how you support “innovation” around your networking infrastructure. To keep in sync with the leading edge of the current API evolution as I'm seeing it, I would also recommend adding virtualized container support at the device level. As a developer I am thankful for the APIs that you are exposing, allowing me to develop custom solutions using your hardware, but I need you to take it one level further--I need to be able to deploy my own APIs using Docker, as well as work with your APIs, all running on your infrastructure.

I need your devices to support not just the web and mobile apps I will build around your hardware, and the API surface area you are providing with the new Tomahawk and Trident II switches--I need to also plug in my own microservice stack, and the microservices that vendors will be delivering to me. I need the next generation of switches to be API driven, but I also need to guarantee it is exactly the stack I need to achieve my networking objectives.

That concludes my intrusion into your road map. I appreciate you even entertaining my ideas. I cannot share many more details on what I intend to do with your new SDN & API driven goodness, but if you work with me to let me innovate on your virtual and physical API stack—I feel that you will be surprised with what the overall community around your hardware will deliver.

See The Full Blog Post

Ask The Stack When You Need API Support

I was profiling the video sharing API Dailymotion the other day, going through their developer area and profiling their API operations. One of the things I do as part of profiling any company is check out how they execute their support.

Dailymotion employs two very common building blocks for their platform support, including an API specific Twitter handle, and good ol' fashioned email—both pretty proven approaches. However, Dailymotion also employs a third aspect to their API support, recommending you head over to Stack Overflow for some community support.

Using Stack Overflow in this way is not that original--I see numerous API providers doing this--but the part I found interesting was their reference to getting Dailymotion API support via Stack Overflow as "ask the stack!" I like that. I think it reflects what Stack Overflow is to the API developer community, and it is an elegant way to send API developers off your site to get the support they need.

See The Full Blog Post

APIMATIC Code-Generation-as-a-Service Has Built-In Support For API Commons Manifest

The API savvy folks over at Apimatic are at it again, pushing forward the conversation around generating software development kits using machine readable API formats, and this time the doorway to your SDK is the API Commons manifest.

I'm going to go ahead and use their own description, as it sums it up well, no augmentation needed. Using the code generation API, you can generate SDKs for your API directly from your Github repository.

Step 1: Describe your API using some format. You may choose from Swagger, RAML, API Blueprint, IODocs, and Google Discovery formats. Automatic code generation makes use of information in your API description to generate method and class names. Please be as expressive as possible by not leaving out any optional fields as applicable, e.g., not leaving out types and schemas for your parameters and fields.

Step 2: Define meta information about your API using the API Commons manifest format. You can generate your API Commons manifest using the API Commons Manifest generator. Be sure to enter all relevant information. Upload the generated manifest as a new file in the root directory of your Github repo by the name "api-commons-manifest.json". Be sure to have the correct name and location of this file.

Step 3: Open/Create a markdown file (README.md is a good candidate). Add the following markdown syntax to render an image link.

[![apimatic][apimatic-{platform-name}-image]][apimatic-{platform-name}-url]

[apimatic-{platform-name}-url]: https://apimatic.io/api/github/{account-name}/{repo-name}/{branch-name}?platform={platform-name}

[apimatic-{platform-name}-image]: https://apimatic.io/img/github/{platform-name}.svg

  • Replace the {platform-name} token with one of the following values: windows, android, ios
  • Replace the {account-name} token with your url-encoded Github account name
  • Replace the {repo-name} token with your url-encoded Github repository name
  • Replace the {branch-name} token with your url-encoded Github branch name where the API Commons manifest file is present.

To validate, open the following url after replacing tokens. This url should open the raw manifest file: https://raw.githubusercontent.com/{account-name}/{repo-name}/{branch-name}/api-commons-manifest.json

You can see an example here. Commit changes and navigate to your Markdown file in your browser. You will see apimatic widgets (image links), which you can click to generate SDKs for your API. To see an example, open this link to view the README.md file in raw text form.
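The token replacement in those steps is mechanical enough to sketch in a few lines of Python--this is my own helper for producing the image-link markdown, not something Apimatic provides:

```python
from urllib.parse import quote

def apimatic_badge(account, repo, branch, platform):
    """Render the apimatic image-link markdown with all tokens replaced."""
    # Url-encode the Github account, repo, and branch names as instructed.
    account, repo, branch = (quote(v, safe="") for v in (account, repo, branch))
    return (
        f"[![apimatic][apimatic-{platform}-image]][apimatic-{platform}-url]\n\n"
        f"[apimatic-{platform}-url]: https://apimatic.io/api/github/"
        f"{account}/{repo}/{branch}?platform={platform}\n\n"
        f"[apimatic-{platform}-image]: https://apimatic.io/img/github/{platform}.svg"
    )

md = apimatic_badge("kinlane", "api-commons", "master", "android")
```

Dropping the output into a README.md renders the clickable widget for each platform you want to offer.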

The Apimatic team is owning the conversation when it comes to generating full-fledged SDKs for your APIs. I always hear folks talk about the limitations of auto-generated client side code, but the Apimatic team is pushing the conversation forward with their persistent approach.

See The Full Blog Post

Why Would You Ever Give Students API Access To The Student Information System (SIS), And Let Them Build Un-Sanctioned Apps That We Will End Up Having To Support?

I went up to California State University Channel Islands the other day to talk APIs with their tech team, and I was happy to find at least one strong API skeptic on the team. API skeptics give me material for stories, so I thoroughly enjoy coming across them, and telling these stories is how I keep polishing my argument for the next API skeptic I encounter in campus IT, at the higher educational institutions that I visit.

During the discussion I was posed several interesting questions, and one of them was: why would you ever give students API access to the Student Information System (SIS), and let them build un-sanctioned apps that we will end up having to support?

Family Educational Rights and Privacy Act (FERPA)
FERPA gives students the right to review, control disclosure of, and request amendment of their education record. Increasingly this goes beyond just a web interface, PDF, or printed copies. President Barack Obama mandated that all federal agencies begin providing information in machine readable formats, and many cities and states are putting it into law as well. A student should always have access to their data, and they should be able to choose to do this via campus applications, or obtain a portable copy of their record for storage in a location of their choosing, or possible use within a 3rd party system of their choice—it's their data. Period.

Un-Sanctioned App Concern Is Just A Red Herring
Modern API management infrastructure like 3Scale and WSO2 provides an unprecedented level of control over API access, requiring secure on-boarding of new developers and the establishment of service composition definitions, while providing rich real-time analytics on how APIs are used, and by whom--all while seamlessly integrating with existing identity and access management solutions. The university gets to choose who has access to which services, revoke access when abused, and better understand how resources are really being accessed and put to use. Ideally this applies to all campus-wide usage, as well as external 3rd parties--modern approaches to API-centric operations include the management of internal, partner, and public resources in this way.

A More Balanced Governance Across Campus Resources
Modern API management was born out of traditional IT governance, but is more focused on giving access and control to the end-users who are the actual owners of the valuable data, content, and other digital resources being made available. Legacy campus IT models provide a governance model that involves IT, administrative, and faculty stakeholders, but rarely includes the students. APIs give students secure access to their data, and standards like oAuth open up the ability for them to have a vote in who has access to their data, with oAuth scopes being defined by existing institutional governance efforts. When APIs enter the conversation, governance expands to be more self-service, real-time, and within the control of students, as well as administrators, faculty, and campus IT.
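As a sketch of what oAuth scopes can look like in practice--the scope names here are hypothetical, something an institution would define through its own governance process:

```python
# Hypothetical oAuth scopes an institution might define for a student record API.
RECORD_SCOPES = {
    "profile": "profile:read",
    "schedule": "schedule:read",
    "grades": "grades:read",
}

def visible_record(record, granted_scopes):
    """Return only the parts of a student record the student chose to share."""
    return {field: value for field, value in record.items()
            if RECORD_SCOPES.get(field) in granted_scopes}

# A student approves a scheduling app for profile and schedule, but not grades.
record = {"profile": {"name": "Jane"}, "schedule": ["MATH 101"], "grades": ["A"]}
shared = visible_record(record, {"profile:read", "schedule:read"})
```

The point is that the student holds the vote: the same record API serves every app, and each student decides which slices of their record any given app can see.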

Possibility Of Good Things Happening Closer To The Student
In the current educational environment, where students are often more tech savvy than faculty and administrators, why would we want to eliminate serendipity, and the possibility that new things might happen? Solutions to problems that students actually face every day, that campus administrators may never think of, because they see technology through a very different lens. The days when IT knows best, regarding what devices, browsers, apps, and websites are optimal for getting things done, are in the past. Shadow IT demonstrates this, where students, and even faculty, are using un-sanctioned solutions to get their work done. Campus IT should be empowering students, encouraging a more digitally literate individual who will soon be entering the workforce, not suffocating this.

Easy For Campus IT To Miss The Big Picture
I am a recovering IT administrator, so I understand the challenges a skeptical campus IT administrator faces, but ultimately restricting access to campus resources just makes your job harder, making you the bottleneck that everyone so commonly complains about when it comes to IT. APIs don’t create more work for you; they make you much more agile and nimble in how you integrate systems, build new web and mobile applications, and provide 3rd party vendors access to campus resources—as well as potentially opening up self-service access to students.

With an API-centric approach you will know exactly who is accessing resources, and how they are using them--in real-time. I’m betting that you don’t have this visibility across all of your IT resources right now. When I put on my IT director hat, I prefer the API model, because then all resources are self-service, available to only those who SHOULD have access, all without them having to talk to me. I’m left alone to do what I do best, and can also monitor new signups, real-time usage, and manage support tickets in accordance with wider IT support strategies.

I understand you are a skeptic about APIs being a thing students should have access to, and in reality most students will not care, but there will be a long tail of student users who will do things you never imagined, and potentially change how you look at scheduling, document management, or other staples of campus operations—something that will never happen if you don’t make students a priority when it comes to your digital resource management.

Disclosure: 3Scale and WSO2 are both API Evangelist partners.

See The Full Blog Post

Developer Support With Google Helpouts

I was cruising around the Google Developer area and stumbled across Google Helpouts, a service being billed as “Experts with Answers, Meet Developers with Questions”. It seems like a version of Stack Overflow where you can get the answers you are looking for when it comes to development on the Google platform, but in a more direct, one-to-one way.

Inversely, you can contribute your developer skills and be one of the knowledgeable developers giving back to the developer community. The Google Helpouts page describes it as: Give Back - Continue Learning - Build Your Reputation. I signed up for an account and set up my profile, but the tags that I was able to connect with my profile really didn’t match my skills. It seems very programming language and platform focused at this point--not really for APIs.

At first glance I thought Google Helpouts was more about Google APIs, available as an extension of the Google Developer program, but it seems to be a more tech focused, skills matching service. I think the same concept would work well if applied to APIs, but I don’t see any reason for a new platform--just build it on top of the Stack Overflow API—why re-invent the wheel?

See The Full Blog Post

Sales, Onboarding And Support In A Self-Service API World

I was reviewing an API over the last couple of weeks--I signed up for an account, came back several times, and made a handful of API calls in hopes of learning more about how the API works. This is something I do a lot, and it is always interesting to experience the onboarding process (or lack of one) for APIs.

I first signed up about two weeks ago for this particular API, and within 48 hours I received an email asking if I needed help with my integration--that was nice of them. I like getting an email from the provider, and the more human it is, the better. At this point I didn't need any help--I was just playing, learning, and depending on the self-service resources made available to me via the API ecosystem.

About a week later I got another email, again asking if I needed help. At this point I had put the API down, but when I pick it back up I might respond to the platform then--I just have more learning to do before I have questions. Then about a week later, right before I was about to pick up the API again, I got another email asking me what my plans were, putting more pressure on me to share how I was going to be using an API that I'm just not sure about yet.

I am not your usual API consumer. I know this. However, this is an API I was actually planning on integrating into my core API tracking system at some point, so in addition to being the API Evangelist who might write a story, there is a good chance I will become a customer. I really like, and believe in, the self-service nature of APIs, and while I like getting an email letting me know someone is home, I tend to be turned off by each consecutive email--nothing reminds you that you are in someone's sales funnel like a series of emails.

When you consider the touch points for your API onboarding flow, make sure to think about the different types of users who will be registering, and the fact that not everyone will fit squarely in your perfect funnel definition. You want to make sure your API consumers know that someone is home, and that you are there to help, but you really should rely on your self-service resources to get the bigger job done. Your initial email after I sign up should do this, then leave the next steps up to me, and be very thoughtful, and possibly dynamic, with each engagement after that.

See The Full Blog Post

My Continued Support As Signer Of Oracle v Google Amicus Brief From EFF

As the Oracle v Google API copyright case was on its way to the Federal Circuit Court in 2012, the EFF reached out to me for help in crafting stories of how important it is that APIs remain free of copyright, ensuring they remain open and interoperable. I shared three stories, one on cloud computing and AWS APIs, the second on Delicious APIs, and the third on Instagram APIs, all reflecting three different scenarios that would never have happened if APIs were copyrightable.

A couple of weeks ago the EFF reached out again, asking for my signature on another amicus brief supporting Google's request that the Supreme Court review the circuit court's decision, and reverse it. As it stands right now, there is a precedent that copyright can be applied to APIs, and even though the case itself is moving on to the question of fair use, if we let the current decision stand, other companies can follow Oracle's damaging lead and sue for protection of their APIs--which is why we need to convince the Supreme Court to review, and overturn, this very damaging decision.

We expect that the Supreme Court will decide whether or not to grant Google's petition to review the Federal Circuit's decision sometime in January or February of 2015. So I need your help to stir up buzz around the issue: post stories on your blog, call your congressman, and light up any other channel you can to help educate people about the importance of the issue. When the Supreme Court takes on the case (which I feel strongly it will), we will need to regroup and refile a more expansive brief on the importance of APIs remaining free of copyright.

I'm working to expand my restaurant menu analogy to help people understand the importance of APIs, but if you have other analogies that you think would help the Supreme Court understand the separation of API interfaces from their supporting server-side or client-side code, please share them with me so I can work with the EFF to potentially include them in the next brief that is filed. APIs are touching every sector of business in 2014, and if we allow Oracle's copyright claim to stand, we are in danger of pouring glue into the gears of each of these business sectors, at a point in time where we need to introduce as much lubrication and transparency as we possibly can, to ensure that the web, mobile, and Internet of Things applications built on APIs remain open and interoperable--serving not just their platform owners, but also developers and end-users equally.

Join me in helping bring awareness to this issue, which, right along with net neutrality, is one of the most important issues we currently face when it comes to deciding the future of technology, and how our society works, shares, collaborates, and interoperates in the new online digital world we have created for ourselves.

See The Full Blog Post

Wunderlist adds Dropbox support to help you better manage your to-dos

The mobile and desktop to-do list app Wunderlist has added Dropbox support, allowing users to automatically attach files stored in the cloud directly to tasks. The company said the move is the first of many integrations of third-party productivity tools and that Dropbox support in particular was one of the most requested features. To get started with it, all you need to do is click on the Dropbox item in detail view and select the file you want to add. If you then change that file in some way, those changes will automatically sync back to the file attached to Wunderlist, negating any need to reattach it. A spokesperson for the company confirmed that the new feature is available on Wunderlist for the Web and Android from today and that it should land for iOS devices just as soon as Apple approves the updated app in the “next few days”.

URL: http://feedproxy.google.com/~r/TheNextWeb/~3/BqBAAUXu4gA/

See The Full Blog Post

Explaining My Work Around APIs In Higher Education To Institutions

I need to quantify the work I do around APIs in higher education for a university in the U.K., so I figured I'd craft it into a story that I can share with my readers, and potentially with other schools who ask what it is that I do.

I am interested in APIs in higher education because I feel strongly that our institutions are a fertile environment for ensuring that the next generation of our society possess the digital literacy they will need to navigate and find success in our increasingly digital world.

When I mention that higher education institutions should be incorporating APIs into daily operations, most folks immediately think of a very technical, IT-directed effort. That is one layer of the discussion that should be considered, but ultimately it is about developing awareness of, and engagement with, the increasingly public APIs that surround us, as well as internal institutional APIs, by administration, faculty, and students.

Our personal, and business lives are moving into the cloud. We are increasingly dependent on web-based and mobile applications in many aspects of our lives, and a growing number of these software services have APIs, and API-enabled actions that can be taken by anyone, even non-developers.

My mission as the API Evangelist is to help everyone be aware that APIs exist, and that they are there to assist you in your personal and business life. Which brings me back to higher education institutions being an important place to expose students, and faculty to the benefits of APIs—in preparation for the increasingly API driven world unfolding around us.

While I pay attention to over 100 categories of APIs, let's take one area, photos and images, and use it as a backdrop for what I do. I pay attention to over 25 photo and image APIs that provide image-related services for developers, as well as direct access to the popular photo and image platforms we often depend on, like Flickr and Instagram.

As the API Evangelist I seek to pay attention to three key layers of the API space.

Technology of APIs - The geeky details of how APIs work.

  • How do I authenticate?
  • Where do I read data from?
  • How do I write data?
  • Where are the code samples in my language?
  • Where is the WordPress plugin?
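To ground that first layer, here is a minimal Python sketch of what reading data from a photo API often involves; the endpoint, parameter names, and bearer-token scheme are hypothetical stand-ins, since every provider documents its own:

```python
import urllib.parse
import urllib.request

def build_photo_request(api_key: str, user_id: str) -> urllib.request.Request:
    """Build an authenticated GET request for a user's photos.

    The endpoint and parameter names are made up for illustration;
    check the provider's API reference for the real ones.
    """
    params = urllib.parse.urlencode({"user_id": user_id, "per_page": 10})
    request = urllib.request.Request(f"https://api.example.com/v1/photos?{params}")
    # Many APIs authenticate with a bearer token in the Authorization header.
    request.add_header("Authorization", f"Bearer {api_key}")
    return request

req = build_photo_request("my-secret-key", "kinlane")
print(req.full_url)
```

Answering the rest of the questions on the list--where to write data, which code samples exist--comes down to the same exercise of reading each provider's documentation.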

Business of APIs - The business profile behind any online service.

  • How much does the service cost?
  • What are the limitations of what I can read or write?
  • What kind of support is available to me?
  • What are the long term plans of the company?
  • Is the platform and tools well documented?

Politics of APIs - The often complex politics behind the curtain.

  • Do I own my data? Do I own my photos?
  • Can I specify the license for my photos?
  • What are the security practices?
  • Do I have control over who accesses my data (aka OAuth)?
  • Are code libraries, tools, and other code open sourced?
  • Are the terms of service easy to understand? Fair?

APIs are much more than just technology. Compare platforms like Flickr and Instagram: both act as photo sharing platforms, but they have widely different approaches to the technology, business, and politics of their APIs. Sometimes it is just small details that make the difference in whether or not a platform is the right choice for any individual or business.

Beyond the technology, business, and politics I also pay attention to a handful of important trends that are growing out of the API space, to support API consumers.

Realtime - What options are there to interact in realtime.

  • How do I send / receive push notifications?
  • How do I use webhooks to receive updates?
  • Does this API have a streaming API?
  • What frameworks are available for real-time delivery?
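As a sketch of one of these realtime building blocks, here is how a webhook receiver might verify that an incoming payload actually came from the platform; the HMAC-SHA256 convention and shared secret are assumptions, since each provider documents its own signing scheme:

```python
import hashlib
import hmac

def verify_webhook(payload: bytes, signature: str, secret: bytes) -> bool:
    """Return True when the signature matches our own HMAC of the payload."""
    expected = hmac.new(secret, payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking timing information to an attacker.
    return hmac.compare_digest(expected, signature)

secret = b"shared-secret"
body = b'{"event": "photo.uploaded"}'
good = hmac.new(secret, body, hashlib.sha256).hexdigest()
print(verify_webhook(body, good, secret))  # → True
```

Verifying signatures like this is what lets a platform push updates to you, instead of you polling it.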

Aggregation - Details on API data and information aggregation.

  • Can I get all of my images from Flickr and Instagram in a single feed?
  • Can I publish to one place, and have it syndicate out?
  • What options do I have for POSSE (Publish (on your) Own Site, Syndicate Elsewhere)?
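A rough sketch of what aggregation looks like once records have been fetched; the feeds, field names, and normalization below are all hypothetical, but mapping each provider's fields onto a common shape before merging is the real work:

```python
# Hypothetical, already-fetched records from two photo platforms.
flickr = [
    {"title": "Sunset", "taken": "2014-08-01T19:30:00"},
    {"title": "Harbor", "taken": "2014-08-03T08:15:00"},
]
instagram = [
    {"caption": "Lunch", "created_time": "2014-08-02T12:05:00"},
]

def normalize_flickr(photo):
    return {"title": photo["title"], "taken": photo["taken"], "source": "flickr"}

def normalize_instagram(photo):
    return {"title": photo["caption"], "taken": photo["created_time"], "source": "instagram"}

def aggregate(*feeds):
    """Merge normalized photo records into a single newest-first feed."""
    merged = [photo for feed in feeds for photo in feed]
    # ISO 8601 timestamps sort correctly as plain strings.
    return sorted(merged, key=lambda p: p["taken"], reverse=True)

feed = aggregate(
    [normalize_flickr(p) for p in flickr],
    [normalize_instagram(p) for p in instagram],
)
print([p["title"] for p in feed])  # → ['Harbor', 'Lunch', 'Sunset']
```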

Reciprocity - Approaches to moving data from one system to another.

  • Can I migrate all of my Flickr images to Dropbox?
  • Can I keep all of my Flickr, Instagram, and other photo sites in sync?
  • Where is the best place to store my photos?
  • Can I transform images when moving from location to location?
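At its simplest, reciprocity is a diff between two systems; this sketch (with made-up filenames) just computes what the destination is missing, which is the heart of any migration or sync job:

```python
# Hypothetical snapshots of what each system currently holds.
flickr_photos = {"sunset.jpg", "harbor.jpg", "lunch.jpg"}
dropbox_files = {"sunset.jpg"}

def plan_migration(source: set, destination: set) -> set:
    """Return the items the destination lacks and should receive."""
    return source - destination

to_copy = plan_migration(flickr_photos, dropbox_files)
print(sorted(to_copy))  # → ['harbor.jpg', 'lunch.jpg']
```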

I'm not just watching and monitoring what is going on in the API space; my mission demands that I take what I learn and produce content that everyone else can benefit from. To fulfill this, I take all of my curation, analysis, and awareness, and I work to create rich content that is designed for the widest distribution possible:

  • Short Form - Blog posts on my primary blogs (apievangelist.com, apivoice.com, api.report, theapistack.com, ipaevangelist.com, and kinlane.com)
  • Long Form - White papers and ebooks that get published either publicly, or in some cases privately for internal organizational distribution.
  • Presentations - Walkthroughs, talk presentations, and other interactive content that introduce people to the world of APIs.
  • Videos - Generating video content, around my research and presentations for publishing to YouTube and other video distribution sites.

I strongly feel that it isn't enough to do all of this work online; I'm also invested in stimulating in-person conversations in the following formats:

  • Conversations - In person, small group conversations with team members about APIs.
  • Classrooms - Teaching large, or small classrooms at institutions around the world.
  • Meetups - Supporting the creation of, and guest speaking at, API-focused meetup groups around the world.
  • Conferences - I support up to 10 API conferences, while also putting on my own global conference on APIs, @APIStrat.

API evangelism for me is not just about educating developers that APIs exist, it is about bringing awareness to the masses that APIs exist, and they are just the next evolutionary layer of the web that has penetrated almost all aspects of our lives. I focus on bringing an awareness of APIs to:

  • Individuals - Every citizen. API literacy is like financial literacy: you don't need to understand the entire banking system, but you do need some basic financial literacy to manage your bank account and credit cards. The same applies in the API space: you don't need to understand everything about OAuth, but you should know who has access to your bank, Facebook, and YouTube accounts.
  • Business - More business activity is occurring online, and APIs are rapidly making business data and information available to the average business user for use in spreadsheets, mobile and web applications. Basic API literacy is fast becoming a requirement for many business sectors.
  • Institutions - Over the last 15 years, many institutions have moved online, establishing a web presence, and connecting with partners, and constituents via the web. Large organizations need to understand how to use APIs to become more resilient, agile, and nimble in an age where much is changing when it comes to how the institutions of yesterday are perceived and stay relevant in the future.
  • Government - Our government is required to do more with less, and APIs are the way we are slowly shifting how governments of all sizes operate and engage constituents. Federal, state, and local governments are opening up data, making resources available via APIs, and shifting the burden of governance to private sector partners, and even the public, using APIs.

Helping everyone understand the API undercurrents occurring right now all around us is my mission as the API Evangelist. Everyone needs to know that APIs exist, and have a basic understanding of how they can put them to use in their personal and business worlds. This isn't some grand techno vision from Silicon Valley that I'm looking to sell to universities; this is about what is already happening around us on the web, via our personal mobile devices, our homes, businesses, cars, and much, much more. API Evangelist is about helping regular folks see what is happening, understand as much of it as they can, and take meaningful action within the context of their lives.

Bringing this higher education API Venn diagram tighter is additional work I'm doing within two very quiet, but extremely important initiatives:

  • Domain of One’s Own - A program occurring at a handful of higher education institutions, where students get their own domain upon entering school, which can be used for their projects throughout their academic career, and even beyond if they choose—giving each student the gift of web literacy.
  • Reclaim Your Domain - A secondary project spun out of Domain of One’s Own which is about providing education materials, tools, and other resources for students, and any other individual to map out their online domain, and put together a strategy for reclaiming parts of their digital identity.

To do my work I use my own custom developed internal system, where I aggregate information about over 2500 APIs, 1500 RSS feeds, 2500 Twitter accounts, and 750 Github accounts. From there, all of my short form and long form content, as well as presentations, and other supporting research gets published to 75 individual open source research projects managed using Github. My intent is to make sure all of my research, and resulting data is publicly available in open formats, allowing anyone to fork, extend, and expand on what I do.

How can my research help your institution better understand, internalize, and apply APIs?

You can visit my research on APIs in higher education here:

Here are some other recent stories I've published recently in the area of APIs and education:

See The Full Blog Post

The API Focused Dev Shop

I tag a lot of interesting companies that show up during my weekly API monitoring. When I see a tag go from 1 or 2 companies to over 5, I take a closer look to see what is going on. An increase in the number of companies focusing on a specific area could be a trend, or it could be nothing.

The tag "API Agency" ticked over to 6 today when I added Aquevix, an Indian company that is focusing on API development. As of August 2014, I now have six separate agency-style companies that I've found who have a focus on API design and development:

API Support

Developer support beyond FAQs, forums, and documentation. First-class support for your API. Processing an API request often means directly or indirectly interacting with two or more systems. Is your support team equipped with the necessary training, tools, information, and support infrastructure to be successful? IT Assist helps you design and implement your API support strategy and infrastructure, offers training to your support team, and, as needed, handles your developer support.


We are a small independent software business specialising in the enhancement of existing software. We use standalone applications and APIs to extend the functionality of existing systems. Integration of separate systems: we connect disparate online SaaS applications together to create an integrated system. Automation of business processes: we convert manual, menial tasks into automated business processes, reclaiming time and costs.


We design market strategies for companies looking to extend their APIs into digital partnerships. API strategies to accelerate social and mobile content and digital partnerships. Big data asset definitions and valuations to define API approaches. API pricing and tier strategies to monetize data. Developer and partner outreach strategies to expand mobile success and "mash-ups". Business model development for emerging startups, matching them to enterprise clients and needs. Best practices to compete and win in the social, mobile, and big data marketplaces.


Aquevix is the go to company that can weave the abstract into a finished product. We are a software company that provides innovative, business solutions worldwide. Need an API? We got you covered. We provide full REST/JSON based API implementation and related apps with best practices! We are capable of developing highly specialized APIs that integrate seamlessly with powerful apps and increase overall performance of applications.

Blue Jazz Consulting

Blue Jazz Consulting is focused on guiding product companies from idea to revenue using Ruby and Rails and Java-based technologies. Based in Austin, TX, we bring experienced professionals and a history of successful projects to the community. We offer expert consulting services for existing products and early-stage startups. We offer expert consulting and development in software architecture, specifically on the backend requirements of complex B2C and B2B applications. We love to model, design, and build web and mobile APIs to ensure that your business capabilities can be consumed successfully by internal developers, devices, and third-parties using a Ruby or Java platform.


At Stateless we employ our specialist experience, design processes and tools to ensure our clients realise the most value from their APIs. We're making the world of APIs beautiful. We work with you to surface the business goals for the API. We decide on the metrics to measure and track. These will help in understanding whether we are achieving our goals. We create developer personas to represent the various types of consumer, and a roadmap to deliver features tailored to them. We offer engineering and product management resources delivered using the best modern processes and tools. We employ Lean practice by tracking important metrics and constantly feeding that insight back into the live roadmap. We provide technical writers, and use the latest tools and techniques to interlace your test suite and your documentation.

We'll see how many more dev shops I find in the next couple months. Sometimes when I start focusing in a new area I will find more companies working in the same area, purely because I'm looking harder, and sometimes it takes time for things to actually heat up.

As I look at local Chicago web and mobile development shops in the lead-up to API Strategy & Practice in Chicago next month, I'm learning that many small shops like SYDCON and Blaze Portfolio are using APIs, they just haven't shifted their marketing and web site content to reflect the evolution.

It won't take much for many of these web and mobile development shops to standardize their API design and development services, much like the evolution we saw from web to mobile development, shifting the focus of the small dev shop to better serve a growing demand for API design and development. I think by 2015 I'll see over 50 development shops who are heavily focused on API design and development in my API tracking system.

P.S. I REALLY like the API Chappies, and Mike @ Stateless is the freak'n man!

See The Full Blog Post

Where The Good IPAs Are In Chicago While At API Strategy And Practice In September

In preparation for API Strategy & Practice in Chicago, September 24-26th, I did a little research on where the good beers, and specifically the kick-ass IPAs can be found. You may not know, but in addition to being the API Evangelist, I am also the IPA Evangelist (plan b career path), and I'm always interested in knowing where the killer IPAs are, in addition to knowing where to find the best APIs. 

While in Chicago we want to be able to have the tastiest beer possible at the conference, while also having the best options for finding good beer and food after the event, to network and socialize with the 600+ folks that will be at #APIStrat. To prepare for #APIStrat I found 32 local Chicago breweries:

5 Rabbit Cerveceria

The first Latin microbrewery in the US, 5 Rabbit is all about making the best beer.

Ale Syndicate

At Ale Syndicate, we are dedicated to making craft beer worthy of Chicagoans. Please drink responsibly! ALE SYNDICATE BREWERS, CHICAGO, IL BEER.

Argus Brewery

The Chicago beer you SHOULD be drinking. Our Chicago Attitude is something of which we're proud--something we think you'll taste in each Argus brew: flavor, depth, the unusual and carefully brewed taste of a premium craft beer.

Atlas Brewing Co.

Atlas is a brewery and restaurant committed to providing Chicago with fresh, delicious beer and food to match. And we have a bowling alley attached.


First brewed in 1989, Baderbräu is back and here to stay. The return of Chicago's original craft brew. We also hope that you'll try our future offerings, with equally high expectations, and become fans of those too.

Begyle Brewing

A Community Supported Brewing Company! Visit us at our brewery store for growlers and packaged beer to go! 1800 W. Cuyler, Chicago, IL 60613.

Berghoff Beers

Berghoff's history goes back over 120 years and, in that time, has developed a reputation for the high quality and consistency that you'd expect from hard-working men and women dedicated to their craft.

Big Shoulders Beer

Big Shoulders has taken the time to find the right connections and people to be committed to bringing you great beer.

Cahoots Brewing

Cahoots Brewing is a new brewery in Chicago. Our first beer, No S'more Imperial Stout, is out now. Our goal is to build beers together.

Chicago Beer Company

It's Chi-Time! Chicago Beer Company Craft Beers are available in IL.IN.MI. Wheat Rated 90pts GM; Lake Shore Lager 91pts GM & Pale Ale 88pts SM.

Dryhop Brewers

A brewery and kitchen in East Lakeview, Chicago. Tweets by @gregshuff @brantdubovick & @ecgarrity. Cheers.

Finch's Beer Company

Finch's Beer Co. is in the business of brewing great, craft beers locally in Chicago, IL.

Forbidden Root

At Forbidden Root, we craft delicious botanical beverages for today's sophisticated thrill-seeking palates.

Goose Island Beer Co

Since 1988 Goose Island has innovated what beer can be. Follow us to see What's Next. Cheers from Chicago.

Half Acre Beer

We brew many beers throughout the year. Our focus is brewing raw & basic ales & lagers rich in material and undisturbed by process.

Hamburger Mary's

Hamburger Mary's is an open-air bar & grille for open-minded people... that brews its OWN BEER! Yumm :-)

Haymarket Brewing

Haymarket Pub & Brewery opened in December 2010, and features classic Belgian and contemporary American beer styles from brewing legend Pete Crowley.

Hopothesis Beer Co.

We're relentlessly focused on making great craft beer that delivers a flavorful, approachable, balanced drinking experience for geeks and non-geeks alike.

Lake Effect Brewing 

Our name refers to the weather phenomenon most famous for its legendary snow storms. More importantly, the lakes create their own climates where the water temperature promotes an ideal environment for growing barley, wheat, fruits, and hops.

Local Option

The Local Option offers over 30 unique and rare beers from around the world on tap (with many more in bottles), intense Creole food, and a lively atmosphere indicative of the robust beer we proudly serve.

Metropolitan Brewing

Each year, we brew one 30-barrel (that's sixty half barrel kegs) fermentation vessel of Zwickelbier, meant to be enjoyed raw, cloudy, and as fresh as humanly possible. We package it, ship it, and bars across Chicagoland put it on tap pretty much the minute they receive it.

Moody Tongue Brewery

At Moody Tongue, our goal is to create thoughtful, exciting beers that blend familiar flavors and quality ingredients.

Moonshine Chicago

Moonshine is an edgy, urban roadhouse conveniently located in Chicago's fashionable Wicker Park neighborhood.

Off Color Brewing

We make beer. Sometimes we do other stuff but not as well.


Brewing award-winning beer and serving delicious New Haven style pizza. Sports, live music, and live band karaoke every Sat @ 11PM.

Pipeworks Brewing Co

Pipeworks Brewing Co. is a Chicago brewery with a focus on creative small batch beers, brewed with expertise, passion, and rock n roll.

Revolution Brewing

Revolution Brewing is Chicago's new hometown craft brewery. Brewpub in Logan Square & brewery in Avondale.

Rock Bottom Brewery

Passionate about pints. Maniacal for malts. Rock Bottom is always brewing. http://t.co/EfG9h8gWJA.

SlapShot Brewing

Brewing small batch hand crafted ales in Chicago, with a focus on session beers.

Spiteful Brewing

Chicago nanobrewery located in North Center. Fine beer brewed with spite.

Une Année

A Chicago brewery focused on making great beer with an emphasis on Belgian and French styles.

Veteran Beer Company

A company that brews superior quality beer while striving to employ veterans in every role in the organization.

If you know of a brewery that is worthy, and is not listed, add it to the comments below. If one of the breweries listed is kick-ass, please also let us know so we can prioritize the location as part of planning.

After looking through all these killer beers, I'm even more pumped for #APIStrat now. See you in Chicago!!

See The Full Blog Post

Getting To Know Markus Lanthaler For The API Craft 2014 Detroit Hypermedia Panel

I'm preparing for my hypermedia panel with Mike Amundsen (@mamund), Mike Kelly (@mikekelly85), Steve Klabnik (@steveklabnik), Kevin Swiber (@kevinswiber), Jørn Wildt (@JornWildt), and Markus Lanthaler (@MarkusLanthaler), at API Craft Detroit next week. I wanted to go into the panel with a snapshot, and at least a minimal understanding of each of the panelists. This is kind of an all-star panel of hypermedia experts, so I need to at least bump up my understanding of what they are contributing to the API space, and who they are, beyond what I know from my own interactions with these API leaders.

As I do with all of my research, I wanted to share my work with you, my reader. Next up is Markus Lanthaler. I knew of JSON-LD from work I was doing in the federal government around making government services available, before I knew Markus. I had the pleasure of meeting him when he spoke at API Strategy & Practice in Amsterdam, as well as sharing the stage with him in Germany at API Days.

Here is the outline of my research into Markus's work:

Markus Lanthaler

Using JSON-LD, Hydra, and Schema.org to build awesome Web APIs

Title: Developer, Consultant, W3C Invited Expert
Mission: Working on JSON-LD and Hydra to make Web APIs more fun



JSON-LD is a lightweight Linked Data format. It is easy for humans to read and write.
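To make that concrete, here is a minimal JSON-LD record sketched in Python; @context and @type are real JSON-LD keywords, and schema.org is a real vocabulary, while the values describing the person are just illustrative:

```python
import json

# The @context maps plain keys like "name" onto a shared vocabulary,
# and @type identifies what kind of thing the record describes.
person = {
    "@context": "http://schema.org",
    "@type": "Person",
    "name": "Markus Lanthaler",
    "jobTitle": "Developer, Consultant, W3C Invited Expert",
}

document = json.dumps(person, indent=2)
print(document)
```

Because the context points at a shared vocabulary, any JSON-LD-aware client can interpret these keys without custom glue code.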






Supporting JSON-LD:


  • JSON-LD NPM Package - A JSON-LD Processor and API implementation in JavaScript.


Hydra is an effort to simplify the development of interoperable, hypermedia-driven Web APIs. The two fundamental building blocks of Hydra are JSON‑LD and the Hydra Core Vocabulary.



  • HydraBundle - a bundle for Symfony2 to create Web APIs based on Hydra
  • HydraConsole - a generic API console for Hydra-powered Web APIs
  • HydraClient - a PHP client library to access Hydra-powered Web APIs
  • JsonLD - a JSON-LD processor implemented in PHP






I'm not posting all of this information just so I can share my research, it is also because Markus has an important vision of where we should take API design, and how we should be linking our most valuable data. While I learned a lot through this process, I will also use it as a reference for my panel at API Craft, and for other stories I write in the future.

I've already added Hydra and JSON-LD as tools in my hypermedia API research. I could spend days going through this research, but I also have two other hypermedia API experts to profile, so I'm going to move on to the others, and come back to my profile of Markus in the future to continue my own hypermedia education.


See The Full Blog Post

Getting To Know Mike Amundsen For The API Craft 2014 Detroit Hypermedia Panel

I'm preparing for my hypermedia panel with Mike Amundsen (@mamund), Mike Kelly (@mikekelly85), Steve Klabnik (@steveklabnik), Kevin Swiber (@kevinswiber), Jørn Wildt (@JornWildt), and Markus Lanthaler (@MarkusLanthaler), at API Craft Detroit next week. I wanted to go into the panel with a snapshot, and at least a minimal understanding of each of the panelists. This is kind of an all-star panel of hypermedia experts, so I need to at least bump up my understanding of what they are contributing to the API space, and who they are, beyond what I know from my own interactions with these API leaders.

As I do with all of my research, I wanted to share my work with you, my reader. So, first up is Mike Amundsen. I'm very aware of Mike's presence in the space, but after doing just a couple hours of refresh on what he's been up to, I'm blown away by the leadership he has brought to how we communicate with APIs. 

Let's dive in, here is the outline of my research into Mike's work:

Mike Amundsen

Title - Director of API Architecture, API Academy at CA Technologies
Mission - Improve the quality and usability of information on the Web.


  • we need to rely on hypermedia formats
  • focusing on high degree of shared understanding
  • significant contributions to the API space with Collection+JSON, Uber, and ALPS
  • teaching us to communicate in a structured way

Mike brings a significant amount of work to the API sector. When you look at Mike's work, you realize how much time he has given to the sector. I picked three significant contributions to focus on for my panel, and ongoing research.


Collection+JSON is a JSON-based read/write hypermedia-type designed to support management and querying of simple collections.
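As a rough sketch of what a Collection+JSON response looks like on the wire: the structural keys (collection, version, href, items, data) follow the media type, while the URLs and values here are made up:

```python
import json

# Skeleton of a Collection+JSON response: a top-level "collection"
# carrying its version, its own href, and a list of items, each with
# an href plus name/value data pairs.
response = {
    "collection": {
        "version": "1.0",
        "href": "http://example.org/photos/",
        "items": [
            {
                "href": "http://example.org/photos/1",
                "data": [
                    {"name": "title", "value": "Sunset", "prompt": "Title"},
                ],
            }
        ],
    }
}

print(json.dumps(response, indent=2))
```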

Key Links:




The Uber message format is a minimal read/write hypermedia type designed to support simple state transfers and ad-hoc hypermedia-based transitions. This document describes both the XML and JSON variants of the format and provides guidelines for supporting Uber messages over the HTTP protocol.



  • Keep the message structure as lean as possible. 
  • Support all the H-Factors in hypermedia controls. 
  • Be compatible with multiple protocols (e.g. HTTP, CoAP, etc.) 
  • Maintain fidelity for more than one base message format (XML, JSON, etc.)



ALPS (Application-Level Profile Semantics)

The purpose of Application-Level Profile Semantics (ALPS) is to document the application-level semantics of a particular implementation. This is accomplished by describing elements of response representations for a target media type. For example identifying markup elements returned (i.e. semantic HTML ala Microformats) and state transitions (i.e. HTML.A and HTML.FORM elements) that advance the state of the current application.



  • Design a document format for describing hypermedia interfaces for use in public distributed network applications. 
  • Discover Web developers' common assumptions when building Web client and server applications. 
  • Explore the challenges of designing and implementing client and server applications for the Web that can independently evolve over time.





Beyond Collection+JSON, UBER, and ALPS, Mike is pretty accomplished when it comes to authoring books on the subject, speaking, and even producing his own event.




I'm not posting all of this information just so I can share my research, it is also because Mike is a leader in the API space, and I want to better understand the role he plays, while also helping you understand along the way. While I learned a lot through this process, I will also use it as a reference for my panel at API Craft, and for other stories I write in the future.

I will also be adding Collection+JSON, UBER, and ALPS as tools in my hypermedia API research. I could spend days going through this research, but I also have five other hypermedia API experts to profile, so I'm going to move on to the others, and come back to my profile of Mike Amundsen in the future to continue my hypermedia education.

See The Full Blog Post

Chief Data Officer Needs To Make The Department Of Commerce Developer Portal The Center Of API Economy

Today, the U.S. Secretary of Commerce Penny Pritzker (@PennyPritzker) announced that the Department of Commerce will hire its first-ever Chief Data Officer. I wanted to make sure that when this new, and extremely important, individual assumes their role, they have my latest thoughts on how to make the Department of Commerce developer portal the best it possibly can be, because this portal will be the driving force behind the rapidly expanding API driven economy.

Secretary Pritzker does a pretty good job of summing up the scope of resources that are available at Commerce:

Secretary Pritzker described how the Department of Commerce’s data collection – which literally reaches from the depths of the ocean to the surface of the sun – not only informs trillions of dollars of private and public investments each year and plants the seeds of economic growth, but also saves lives.

I think she also does a fine job of describing the urgency behind making sure Commerce resources are available:

Because of Commerce Department data, Secretary Pritzker explained, communities vulnerable to tornadoes have seen warning times triple and tornado warning accuracy double over the past 25 years, giving residents greater time to search for shelter in the event of an emergency.

To understand the importance of the content, data, and other resources that are coming out of the Department of Commerce, you just have to look at the list of agencies underneath Commerce who already have API initiatives:

Then take a look at the other half, who have not launched APIs:

The data and other resources available through these agencies, underneath the Department of Commerce, reflect the heart of not just the U.S. economy, but the global economy, which is rapidly being driven by APIs, which are powering stock markets, finance, payment providers, cloud computing, and many other cornerstones of our increasingly online economy.

Look through those 13 agencies: the resources they manage are vital to all aspects of the economy, from telecommunications, patents, weather, oceans, and census, to other areas that have a direct influence on how markets work, or don't work.

I’m all behind the Department of Commerce, hiring a Chief Data Officer (CDO), but my first question is, what will this person do? 

This leader, Secretary Pritzker explained, will oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic.

Yes! I can get behind this. In my opinion, in order for the new CDO to do this, they will have to quickly bring each agency's /developer program up to a modern level of operation. There is a lot of work to be done, so let's get to work exploring what needs to happen.

A Central Department of Commerce Developer Portal To Rule Them All
Right now the Department of Commerce developer portal at commerce.gov/developer is just a landing page, an afterthought to help you find some APIs--not a portal. The new CDO needs to establish this real estate as the one true portal, which provides the resources other agencies will need for success, while also providing a modern, leading location for developers of web, mobile, and Internet of things applications, as well as data journalists and analysts, to come find the data they need. If you need a reference point, go look at the Amazon Web Services, SalesForce, eBay, or Google developer areas--you should see this type of activity at commerce.gov/developer.

Each Agency Must Have Own Kick Ass Developer Portal
Following patterns set forth by their parent Department of Commerce portal, each agency underneath needs to possess its own best of breed developer portal, providing the data, APIs, code, and other resources that public and private sector consumers will need. I just finished looking through all the available developer portals for Commerce agencies, and there is no consistency between them in user experience (UX), API design, or resources available. The new CDO will have to immediately get to work taking existing patterns from the private sector, as well as what has been developed by 18F, and establish common patterns that other agencies can follow when designing, developing, and managing their own developer portals.

High Quality, Machine Readable Open Data By Default
The new CDO needs to quickly build on the existing data inventory efforts that have been going on at Commerce, making sure any existing projects are producing machine readable data by default, and making sure all data inventory is available within each agency's portal, as well as at data.gov. This will not be a one-time effort; the new CDO needs to make sure all program and project managers also get the data steward training they will need, to ensure that all future work at the Department of Commerce, associated agencies, and private sector partners produces high quality, machine readable data by default.

Open Source Tooling To Support The Public And Private Sector
Within each of the Commerce, and associated agency, developer portals, there needs to be a wealth of open source code samples, libraries, and SDKs for working with data and APIs. This open source philosophy also needs to be applied to any web or mobile applications, analyses, or visualizations that are part of Commerce funded projects and programs, whether they are from the public or private sector. All software developed around Commerce data that receives public funding should be open source by default, allowing the rest of the developer ecosystem, and ultimately the wider economy, to benefit from and build on top of existing work.

Machine Readable API Definitions For All Resources
This is an area that is a little leading edge, even for the private sector, but it is rapidly emerging to play a central role in how APIs are designed, deployed, managed, discovered, tested, monitored, and ultimately integrated into other systems and applications. Machine readable API definitions are being used as a sort of central truth, defining how and what an API does, in a machine readable, common format that any developer, and potentially any other system, can understand. Commerce needs to ensure that all existing, as well as future, APIs developed around Commerce data possess a machine readable API definition, which will allow all data resources to be plug and play in the API economy.

Establish An Assortment Of Blueprints For Other Agencies To Follow
The new Commerce CDO will have to be extremely efficient at establishing successful patterns that other agencies, projects and programs can follow. This starts with developer portal blueprints they can follow when designing, deploying and managing their own developer programs, but should not stop there, and Commerce will need a wealth of blueprints for open source software, APIs, system connectors, and much, much more. Establishing common blueprints, and sharing these widely across government will be critical for consistency and interoperability--reducing the chances that agencies, or private sector partners will be re-inventing the wheel, while also reducing development costs.

Establish Trusted Partner Access For Public And Private Sector
Open data and APIs, do not always mean publicly available by default. Private sector API leaders have developed trusted partner layers to their open data and API developer ecosystems, allowing for select, trusted partners greater access to resources. An existing model for this in the federal government, is within the IRS modernized e-file ecosystem, and the trusted relationships they have with private sector tax preparation partners like H&R Block or Jackson Hewitt. Trusted partners will be critical in Department of Commerce operations, acting as private sector connectors to the API economy, enabling higher levels of access from the private sector, but in a secure and controlled way that protects the public interest.

Army of Domain Expert Evangelists Providing A Human Face
As the name says, Commerce spans all business sectors, and to properly "oversee improvements to data collection and dissemination in order to ensure that Commerce’s data programs are coordinated, comprehensive, and strategic”, the CDO will need another human layer to help increase awareness of Commerce data and APIs, while also supporting existing partners and integrators. An army of evangelists will be needed, possessing some extremely important domain expertise across all the business sectors that Department of Commerce data and resources touch. Evangelism is the essential human variable that makes the whole open data and API algorithm work. The new CDO needs to get to work writing a job description, and hiring for this army--they will need an 18F, but one that is dedicated to the Department of Commerce.

Department of Commerce As The Portal At The Center Of The API Economy
The establishment of a CDO at the Department of Commerce is very serious business, and it is a role that will be central to how the global economy evolves in the coming years. The content, data, and digital resources that should, and will, be made available at commerce.gov/developer and associated agencies will be central to the health of the API driven economy.

Think of what major seaports have done for the economy over the last 1,000 years, and what role Wall Street has played in the economy over the last century--this is the scope of the commerce.gov/developer portal, which is ultimately the responsibility of the new Department of Commerce Chief Data Officer.

When the new CDO gets started at the Department of Commerce, I hope they reach out to 18F, who will have much of what they need to get going. Then sit down and read this post, as well as my other one on an API strategy for the U.S. government, and once you get going, if you need any help, just let me know--as my readers know, I'm full of a lot of ideas on APIs.

See The Full Blog Post

An API Definition As The Truth In The API Contract

One conversation I had at #Gluecon this year, with Tony Tam (@fehguy) from Reverb, was around the role an API plays as a contract between providers and consumers. "API contract" is a common phrase used to describe how API services are consumed; depending on the on-boarding process, an API provider and consumer can enter into a contract for services around a set of resources, in a self-service way.

In the last couple years, with the increased use of API definition formats like API Blueprint, Swagger, and RAML, we often reference this API definition as a tangible representation of the contract API providers and consumers enter into. In my mind, I see the API definition as one building block, in a larger set of building blocks, that are working together to form a contract, with the API definition acting as the truth.

If you step back and look across multiple API providers, you start to see a variety of building blocks that contribute to the overall "API Contract", that is negotiated between API provider and consumer, starting with the API definition.

API Definition
A machine readable definition of an API interface, using a common format like API Blueprint, Swagger, or RAML, provides a definition of the surface area of the resources that are available via the API. API definitions are proving to be very useful in establishing a common way to describe, communicate, and collaborate around APIs, which are often extremely abstract.
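To make this tangible, here is a rough sketch of what such a definition might look like in a Swagger-style JSON format (the API name, path, and values are hypothetical, for illustration only):

```json
{
  "swagger": "2.0",
  "info": { "title": "Example Contacts API", "version": "1.0.0" },
  "basePath": "/v1",
  "paths": {
    "/contacts": {
      "get": {
        "summary": "List contacts",
        "parameters": [
          { "name": "search", "in": "query", "type": "string" }
        ],
        "responses": { "200": { "description": "A list of contacts" } }
      }
    }
  }
}
```

A few dozen lines like this quantify the surface area of a resource in a way both humans and tooling can read, which is what makes the definition useful as the truth in the contract.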

Terms of Service
Terms of service are the legal portion of the API contract, determining how the service can be used, keeping the company’s interest in mind, but they should also be liberal enough to allow developers and 3rd party integrators to be successful. Plain English versions of terms of service go a long way toward seamlessly fitting into the API contract, allowing API consumers to quickly understand where they stand.

Privacy Policy
The privacy policy of a platform is definitely a very critical element of the API contract, setting the tone for platform operations, the contract with developers, and the promise made to end-users. An awareness, and respect, of privacy issues is part of every successful API platform, and feeds into the overall health of the contractual relationship.

Service Level Agreement
A service level agreement (SLA), sets the quality of service expectations within the API contract. All API consumers should have some sort of SLA that provides a definition of the services being offered, and the level of service, and the reliability end-users can expect as part of API consumption.

Service Accord
As an API provider, if you can’t offer a formal SLA, you should at least provide a service accord with users, setting expectations around what level of service will be offered around an API, and while a service accord isn’t legally binding, it can at least help set expectations, and the tone of API operations for consumers.

Interface License
The Oracle v Google legal battle has brought API licensing front and center, making them part of the overall contract of API operations. Whether you apply copyright, or some other license, an API license will be part of the contract entered into between API provider and consumer.

Data License
Like the API interface, the underlying data stored and returned by an API needs to have a license defined, providing guidance for developers and businesses that are consuming an API's data on how they can store, remix, and put that data to use.

Code License
Providing code in the form of samples, libraries, and SDKs is common practice for API providers, and how the code is licensed will determine how API consumers integrate an API into their applications, providing another layer of licensing within the API contract. This doesn't end with client side code; many APIs will also offer server side API implementations, which should also be appropriately licensed.

Deprecation Policy
How long can API consumers depend on an API? What sort of notification will I get before an API is deprecated? The deprecation policies of an API, play a significant role in the trust established as part of the API contract that API providers and consumers agree to.

Roadmap
Contracts are known for targeting a specific period of time, but API contracts are considered to be open ended, and part of that obligation is communication and openness around an API's roadmap, providing critical details on what is next for an API, allowing API consumers to trust that a contract will not change too quickly.

Change Log
Providing a place where API consumers can see a recent history of changes to an API platform offers a reference for how often an API changes, and how it has evolved over time, balancing out marketing and potential rhetoric around API operations.

Rate Limits
As an API consumer, if we are entering into a contract, I need to know how my contract will scale, and where I exist in the API supply chain. Rate limits provide a unit of measurement that can be used as part of the API contract, quantifying expectations and deliverables.
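On the consumer side, honoring rate limits usually comes down to reading headers off each response. A minimal Python sketch, assuming the common but provider-specific X-RateLimit-* header names (check your provider's docs for the actual names):

```python
def throttle_delay(headers, now):
    """Return seconds to wait before the next call, based on
    hypothetical X-RateLimit-* response headers (names vary by provider)."""
    remaining = int(headers.get("X-RateLimit-Remaining", 1))
    if remaining > 0:
        return 0  # quota left, no need to wait
    reset_at = int(headers.get("X-RateLimit-Reset", now))
    return max(0, reset_at - now)  # wait until the quota window resets

# Example: no calls left, window resets at t=160, current time t=100
delay = throttle_delay({"X-RateLimit-Remaining": "0", "X-RateLimit-Reset": "160"}, 100)
```

Sleeping for `delay` seconds before the next request keeps a consumer inside the quantified terms of the contract, rather than discovering them through 429 errors.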

Uptime / Availability
You can offer a low cost, high value API service, but if your availability and uptime sucks, you are not delivering on the contract entered into with API consumers, making this aspect of operations a very important piece of the API contract.

Pricing
Stipulating the price paid for a service is fundamental to any contract, and the pricing of a particular API represents the value that is exchanged between API provider and consumer, as part of the API contract.

Service Tiers
Most modern APIs have multiple access levels, providing contract negotiation opportunities between API provider and consumer, a feature that allows API contracts to be negotiated, or re-negotiated in real-time, based upon known service tiers.

Support
A contract is a relationship between two businesses, and any relationship requires support to be successful. Direct, and indirect, API support provides assurances to API consumers that once they enter into a contract, the API provider will be there to help in all phases of contract execution.

An API definition provides a tangible, machine readable lens to look at the API contract through, providing a quantifiable truth that describes how an API will be delivered, but the actual API contract requires a suite of other supporting elements, starting with terms of service, and continuing through to how a company supports an API, before the API contract is fully realized.

I'm very optimistic about the evolution of API design, deployment, and management in 2014, now that we have the ability to define an API, and quantify a resource's interface, in a machine readable format. Next we need the same for the other portions of the "API contract", so that we can negotiate contracts for online resources in an automated, frictionless, real-time way.

Photo Credit: Michela Tannoia

See The Full Blog Post

Evolving How We Approach The API Lifecycle With APIMatic

I’ve expanded my monitoring on the world of APIs, from just API management, which I’ve been doing for four years, into tracking on APIs across multiple buckets I'm calling design, deployment, management, monetization, evangelism, discovery, integration, aggregation, reciprocity, and real-time. I am always working to understand who the key players are across the API space, but also make sure they are categorized into one, or many of these expanding buckets--helping me quantify things.

It is always very interesting to see how an API service provider fits into more than one of these buckets, as well as when new players emerge to cater to just one of them, like Apiary did with API design. Playing on this theme, I was introduced to a new API service provider called APIMatic the other day, which on the surface seems to cater to API providers with the automatic generation of SDKs, but really crosses over into API design, discovery, and integration, bringing a new perspective to the table.

Generating SDKs Is The Carrot
As soon as you land on the APIMatic home page, they state very clearly what they do "Automatic SDK Generation for APIs”. You can search common APIs that are available in the APIMatic API marketplace, import your own API definition, or build one using their web-based user interface—which pretty squarely makes APIMatic for API consumers, providing clear API integration benefits, but via the generation of SDKs.

GUI Tool For API Design
The third option for generating an SDK from an API, is using the APIMatic web-based user interface, which allows you to build a definition of your API, using a web interface—no coding necessary. You can create a new definition, manage its settings, define endpoints, and the underlying data model. When ready, you can generate your SDK, which is rendered, I’m guessing using the default APIMatic format, and then also allows you to generate mock APIs, and sandbox environments--pretty squarely in the world of API design for me.

Provides API Discovery
Then the first option for generating an SDK is from a marketplace of existing APIs, allowing developers to generate SDKs for the most common APIs they depend on. From an API provider standpoint, this is a huge incentive for generating machine readable API definitions like Swagger, and registering with marketplaces like APIMatic. I’d say APIMatic starts with a very Mashape style API marketplace for discovery, but then quickly focuses heavily on quality SDKs, and API integration, as the end deliverable.

Supporting API Integration
Since APIMatic generates the code that sits between your app and the API resources it depends on, it has a unique lens for looking into how your applications are using APIs. This provides an alternative to the proxies and tooling I'm seeing emerge to monitor, track, test, and report on API integration. I’m not sure of the pros and cons of this type of API integration, but I think the APIMatic vision is worth taking a closer look at.

Mobile Focused Software Development Kits (SDK)
APIMatic focuses on delivering mobile SDKs, for iOS, Android, Windows, and Java platforms. When I talked to the founders, they said they would eventually provide more web focused SDKs, but mobile is obviously a major driver of API consumption, and was the low hanging fruit.

Centered Around Machine Readable API Definitions
Everything about APIMatic is centered around API definitions. When you select to import an API, you are given the option to import APIMatic, WADL, Swagger, IODocs, GoogleREST, and MashapeJSON formats. Each of the APIs available in the marketplace has a machine readable definition, and the web-based form builder generates a machine readable API definition as well. These definitions are then used to generate SDKs and mock interfaces, and come full circle back to listing you in the APIMatic marketplace for discovery--if you wish to make your API public.
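To illustrate the general idea of definition-driven SDK generation (this is not APIMatic's actual format or output, just a toy sketch with a made-up, loosely Swagger-like definition), a machine readable definition can be rendered into client code with a simple template:

```python
def generate_client(definition):
    """Render a tiny Python client class from a simplified,
    hypothetical API definition dictionary."""
    lines = [
        "import requests",
        "",
        "class {0}Client:".format(definition["name"]),
        "    BASE_URL = {0!r}".format(definition["base_url"]),
        "",
    ]
    for endpoint in definition["endpoints"]:
        # one method per endpoint, passing query params straight through
        lines.append("    def {0}(self, **params):".format(endpoint["id"]))
        lines.append("        return requests.get(self.BASE_URL + {0!r}, params=params).json()".format(endpoint["path"]))
        lines.append("")
    return "\n".join(lines)

definition = {
    "name": "Sheets",
    "base_url": "https://api.example.com/v1",
    "endpoints": [{"id": "list_sheets", "path": "/sheets"}],
}
client_code = generate_client(definition)
```

Real generators add authentication, error handling, and idiomatic models per language, but the flow is the same: the definition is the single source of truth, and everything else is rendered from it.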

Quality Software Development Kits (SDK)
Another aspect I find interesting about APIMatic is that they focus not just on generating code stubs, but on high quality SDKs. In an era where you hear a lot of rhetoric about SDKs being poorly written, out of date, hard to maintain, and unnecessary, this gives APIMatic a potentially important differentiator, if they do it right. APIMatic takes pride in their code being well written, looking nice, and following good conventions--something that could go a long way in benefiting both API providers and consumers.

Seeing new breeds of API service providers continue to emerge, with even new perspectives on how the API lifecycle operates, makes me happy, and with everything centered around machine readable, common API definitions, I'm even more pleased. This is in sync with what I’m seeing come out of the API design, deployment, management, and integration providers I've been tracking on—that API definitions are increasingly driving all stops along the API lifecycle subway.

I will play with APIMatic some more. To be honest, I was approaching this from an API provider's perspective when I jumped on the Google Hangout with them, and was pleasantly surprised when I saw that they crossed over into API design, integration, and discovery, as well as management. Now that I have a good idea of what they offer, I will revisit what the benefits to API providers could be. At the very least, it is another carrot for API providers to be sure and generate machine readable definitions for their APIs--without them your APIs won't be found, or be able to be imported, and drive the next generation of API tooling like APIMatic.

APIMatic is looking to meet API providers, and get your API listed, so if you reach out, make sure and let them know I sent you. If you use this link, and sign up, you will automatically be let into the beta users group--APIMatic is currently “invite only”.

Oh, I also had a good conversation with them about the possibilities around integration of APIMatic with API Commons, and distributed API search engines like APIs.io, using APIs.json—so more to come. Much more!

See The Full Blog Post

Netflix Finally Shutters Support For Public API

Netflix officially announced they will be ending support for their public API. It's no surprise, as they announced early in 2013 that they would no longer accept new registrations for the API.

While I think that Netflix could have put more resources into their API, and fought harder to make their public API a success, I still consider Netflix to be one of the API pioneers that we can learn from when crafting our own API strategies.

While the public Netflix API was not a success, the internal and partner API strategy at Netflix was. APIs have allowed the company to scale into the cloud, grow internationally, and expand to serve over 1,000 devices via their trusted partner network.

In addition to the internal API success at Netflix, they have been amazing at sharing their knowledge and experience with the wider API community via their blog, conferences like API Strategy & Practice, and in books like APIs: A Strategy Guide, from O'Reilly, written by Daniel Jacobson, Greg Brail, and Dan Woods.

Another positive byproduct of Netflix API operations is that the company has been prolific in open sourcing the technology that goes into their API stack. When you visit the Netflix Github account, you will find a wealth of open tooling that they have worked hard on, and opened up for public use.

API success varies widely from company to company, sector to sector, and it doesn't always look like you think it will—this is part of the API experience. Just because Netflix doesn't reflect one vision of API success that open developers believed in, it doesn't mean Netflix as a whole was an API failure. There are plenty of lessons to learn from their public API failure, as well as their internal API success.

See The Full Blog Post

The edX API

This post should tell you how behind I am in my storytelling--this story is from an event I attended in Arlington, TX, on April 30th and May 1st. While in Arlington, I spoke to a group of professionals who were crafting an online data & analytics course. A couple of the participants were from edX, the online course platform partnership between MIT, Harvard, UC Berkeley, and other universities.

Over the course of two days, I had a chance to ask the question: where is the edX API? It seemed like an obvious question, to which Emily Watson, the program manager at edX, responded, “It's on our roadmap!” An answer I get from many online companies, but Emily pulled up their roadmap on the wiki, and indeed it was there.

I told Emily I’d review the edX platform and provide some feedback that they could incorporate into their strategy. This one is easy, and is my basic feedback for any company with an online website and/or application--you make everything currently available on your site or application available via an API. At a quick glance at edX, these would be:

  • Courses - The 175+ courses available on edX, should also be available in a machine readable format, via the edX API.
  • Faculty & Staff - The 400+ faculty and staff involved in producing edX courses should be available in a machine readable format, via the edX API.
  • Schools & Partners - All edX partners should be available in a machine readable format, via the edX API.

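A minimal sketch of how simple read-only APIs for those three resources could be, as a stdlib-only Python WSGI app (the routes and seed data here are hypothetical, for illustration only):

```python
import json

# Hypothetical seed data; a real implementation would read from edX's catalog.
RESOURCES = {
    "/courses": [{"id": "6.00x", "title": "Introduction to Computer Science"}],
    "/faculty": [{"name": "Jane Doe", "school": "MITx"}],
    "/partners": [{"name": "HarvardX"}],
}

def app(environ, start_response):
    """Serve each read-only resource as machine readable JSON."""
    body = RESOURCES.get(environ["PATH_INFO"])
    if body is None:
        start_response("404 Not Found", [("Content-Type", "application/json")])
        return [b'{"error": "not found"}']
    start_response("200 OK", [("Content-Type", "application/json")])
    return [json.dumps(body).encode("utf-8")]
```

Run under any WSGI server (e.g. wsgiref.simple_server) this exposes /courses, /faculty, and /partners as JSON--the low hanging fruit, before any of the business building blocks below.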
You start with the low hanging fruit, establishing three simply designed web APIs for those already publicly available resources, then move on to providing the essential business building blocks for any API:

  • Dedicated API Portal - Simple, dedicated portal for edX API integrations, that isn’t just for developers. It should be easy enough for anyone to learn about the edX API, without too much technical detail front and center.
  • Simple Landing Page - The home page of the edX API portal should be dead simple, explaining what the API does, providing easy one click access to get to whatever any API, or potential API consumer will need.
  • Self-Service Registration - The edX API needs to be accessible 24/7, and developers should be able to register without approval, and at least get minimum access to the system to begin playing with resources, even if it takes approval to get higher levels of access. Modern API management solutions like 3Scale work very well in managing this layer of access, and service composition.
  • Interactive Documentation - The standard for all APIs in 2014 is to provide interactive documentation, making learning about an API, a hands on experience. There are some common approaches to defining an API using machine readable formats like Swagger, which will automatically generate interactive documentation for consumers.
  • Code Samples & Libraries - Developers need all the help they can in getting up and running with an API, and right after interactive API documentation, code libraries and samples are critical to onboarding API consumers.
  • Blog W/ RSS - An active blog, with RSS for syndication provides the necessary stories and resulting SEO that helps communicate the value an API offers, and keeps new and existing consumers in tune with API operations.
  • Twitter Presence - An active Twitter presence is commonplace for leading API platforms, and edX will need an active Twitter account to support its other communications and support operations.
  • Support Ticket System - An easy to use, trackable support system for the edX API platform will be essential to establishing a feedback loop with API consumers. Github issue management works very well as an out of the box API support ticket system for all public APIs.
  • Discussion Forum - Discussion forums are commonplace for APIs, and provide potential indirect, community support that can help new and existing API consumers find what they are looking for. Don’t limit yourself to a local discussion forum; there are SaaS solutions, as well as existing developer driven communities like Stack Overflow.

Next make sure and cover the basic political building blocks:

  • Terms of Service - Provide simple, liberal terms of service that incentivize integrations and development.
  • Content Licensing - Make sure all licensing for any content is explained front and center.
  • Code Licensing - Clearly license all client side code samples, libraries and event potentially server side code for the API.
  • API Licensing - Make sure the API definition is openly licensed, so that it can be re-used, re-mixed in many ways--consider putting into API Commons.
  • Branding Requirements - Provide clear direction when it comes to branding requirements around API integrations.
  • Rate Limits - Help developers understand what rate limits are in place to keep system stable, and what ways there are to get more access.
  • Pricing - Are there any costs for API access? If there are, make sure and clearly explain what is free, and at what points do resources cost.
  • Partner Tiers - Establish multiple layers of API access, allowing trusted partners to achieve higher rate limits, and write access to resources.

That should do it! ;-) As you can see, the APIs themselves are just the starting point. It is too late for edX to be API-first, which would have been much easier, but you just have to get your basic APIs up, along with the supporting business and political building blocks, and get down to the hard work of evangelizing, managing, and iterating on the edX API. The learning doesn't start until you have the API up, being consumed, and have engaged users who are providing feedback for the roadmap.

Once a basic edX API is up and running, and edX has some experience under their belt in managing the API community, then some thought can be put into creating a read / write API, allowing access to student information via OAuth, as well as letting trusted partners publish content and student data into the system. A read / write system could be done in a single release, but since edX is a little behind the curve on getting an API up, I recommend cutting their teeth on a read only system for version 1.0.

In reality, this advice for designing, deploying, managing, and evangelizing an API applies to any company with an online presence. It is a pretty proven formula extracted from watching multiple API leaders like Twitter, Google and Amazon. In this particular scenario, edX needs to get on the ball--it is difficult for me to imagine being an online education platform that operates between multiple universities in 2014, and not having an active API. ;-)

See The Full Blog Post

Another Strong API Implementation In Federal Government With OpenFDA

I am really impressed with the quality of API deployments coming out of the federal government recently. I wrote about the FBOpen API from 18F a couple months ago, and the latest is the OpenFDA API from the Food & Drug Administration. I’ve been watching the rollout of the API from behind the scenes for a while now, but with all my travel and speaking I haven't had time to write about it or participate. Now that they've officially launched publicly, I wanted to help showcase what they've been up to at the FDA.

Meaningful First Impression
When you first land on OpenFDA, you immediately understand what it does, thanks to the interactive visual on the homepage introducing OpenFDA, letting you know that it contains more than 3 million adverse drug event reports, with frequently reported indications for drug use among women 55 to 90. This simple description, combined with an interactive visual that demonstrates the value contained within this government API resource, leaves a meaningful first impression upon arrival.

Not Just Talk Of Being Open
We’ve misused the word open when it comes to APIs, to the point that I’m always skeptical when I see it used--but not in the case of OpenFDA. At the top of the home page, it gives three distinct examples of how OpenFDA embraces open--with data, code, and community--and the API itself is openly accessible, simply by requesting a key in exchange for your email address.

Explaining What It Is All About
OpenFDA explains what the project is all about with a detailed about page, telling you what the API does, who worked on the project, and how you can get involved. This type of background is often overlooked by API providers, forcing API consumers to piece together the big picture around an API on their own—not so with OpenFDA. On-boarding starts with an overview of the project and resulting API, then drops you into what is needed to get signed up and begin using the API.

Key Based Authentication
All that is required to get up and running with the OpenFDA API is a valid email address, in return for which you get an API key that you can use to make all API calls. Of course there are terms of service (TOS) and rate limit restrictions on each key, but this is standard operating procedure for APIs, especially one still in beta.
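Key-based authentication like this usually just means appending your key as a query parameter on every call. As a rough sketch--the api.fda.gov endpoint and api_key parameter reflect OpenFDA's documented conventions, but the search expression and helper function are my own illustration--building an authenticated request URL looks something like:

```python
from urllib.parse import urlencode

# Build an authenticated OpenFDA request URL. The api_key parameter is
# how OpenFDA ties each call back to the email address you registered.
def build_openfda_url(search, limit=1, api_key="YOUR_API_KEY"):
    base = "https://api.fda.gov/drug/event.json"
    params = {"search": search, "limit": limit, "api_key": api_key}
    return base + "?" + urlencode(params)

url = build_openfda_url("patient.drug.drugindication:nausea", limit=5)
```

From there, any HTTP client can fetch the URL and parse the JSON response.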

Deployed Using ElasticSearch
Where the FBOpen API from 18F used Apache Solr, OpenFDA uses the search and analytics platform ElasticSearch to deploy their API. I think both of these approaches reflect an interesting trend in government, deploying APIs from existing, and sometimes messy, data stores, allowing a more meaningful and useful layer to be added with very little effort.

Powerful Search Capabilities
One of the benefits of using existing search solutions like ElasticSearch is that you get some pretty sophisticated search tools with very little work. OpenFDA starts by providing the basic query parameters for search, then adds in query syntax, exact matches, grouping, dates, and ranges—providing very powerful search capabilities out of the box.
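To give a feel for that query syntax--the field names and expressions below are illustrative, modeled on OpenFDA's documented search conventions rather than copied from their docs--a few of the supported patterns might look like:

```python
from urllib.parse import urlencode

# Illustrative OpenFDA-style search expressions: an .exact suffix matches
# a whole phrase rather than individual tokens, and [A TO B] expresses a
# range (dates here, in YYYYMMDD form).
exact_match = 'patient.drug.drugindication.exact:"DRUG USE FOR UNKNOWN INDICATION"'
date_range = "receivedate:[20040101 TO 20081231]"

# A count parameter groups results by a field instead of returning raw records.
query_string = urlencode({"search": date_range, "count": "patient.patientsex"})
```

That grouping capability is what powers the kind of visualizations OpenFDA shows on its homepage.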

Necessary Terminology To Get Started
When using the OpenFDA API, you are accessing drug events, which will require getting up to speed on the terminology of the world of pharmaceuticals, unless of course this is the world you already live in. OpenFDA provides the necessary definitions to get anyone up to speed on the terminology needed to understand what is contained in the 3 million drug event reports.

Interactive API Documentation With Visuals
Interactive API documentation, allowing API consumers to make live API calls while learning about an API, is fast becoming essential for any API. OpenFDA provides interactive documentation, making learning about the OpenFDA API a hands-on experience, but takes this one step further by providing supporting data visualizations. I’ve seen a lot of interactive API documentation, and I don’t believe I've ever seen real-time visualizations to go with documentation—something I will explore further in a separate story.

Essential Communication Channels
OpenFDA doesn't miss a beat in establishing the required communication channels for the API, providing updates on the platform in a blog format, available directly from the home page. Additionally, OpenFDA employs Twitter, providing a real-time conversation around this valuable API resource between the FDA and 3rd party consumers. Open, active communication channels, providing two-way communication between a platform and its consumers, are one of the essential ingredients that make this whole API thing work.

Essential Support Framework
Building on top of the OpenFDA platform updates and Twitter communication channels, OpenFDA provides a multi-tiered support framework, allowing API consumers to ask questions on StackExchange, report bugs via Github, and send feedback via email. The OpenFDA support framework, plus the open communication channels, establishes a robust feedback loop for the FDA around drug event report data.

Active Github Presence
Github is an essential building block for all government APIs in my opinion, and OpenFDA agrees. You can find code samples for OpenFDA in a Github repository, submit bug reports via Github issue management, and even the developer portal for the OpenFDA API runs using Github Pages and an underlying repository--a nice start for your Github account, OpenFDA.

Realizing This Is More Than Just Tech
Showcasing the idea that APIs are way more than just the underlying tech is what API Evangelist is all about, and in my opinion OpenFDA gets this 100% with their API implementation. The OpenFDA technical implementation is built using existing technology, on top of valuable open government data, providing the necessary technical building blocks to be considered a modern web API. However, OpenFDA is so much more, delivering the best first impression possible, frictionless on-boarding, and the education you will need to put OpenFDA to work. OpenFDA also provides the feedback loop necessary to ensure an API's success--with updates, Twitter, StackExchange, Github, and good ol' email, there will be the discussion around OpenFDA that is required to move things forward.

The OpenFDA API is in beta, but it already possesses many of the essential building blocks for a successful API. I’m sure with some hacking I could find room for improvement in the API and supporting operations--something I will do as I have time, adding to the existing feedback I've been seeing from other beta users. For now, I’m impressed with the release, and when you bundle it alongside other API initiatives like FBOpen and We The People, it makes me very optimistic about what APIs are going to be able to accomplish in the federal government.

See The Full Blog Post

API Evangelism

API Evangelism

An API is useless if nobody knows about it. Evangelism has emerged as the approach to selling, marketing, and supporting an API platform. While the intent of evangelism can be sales and marketing, the philosophy that has proven successful is to find a balance that focuses more on API support and engagement with consumers than on sales.

A healthy API evangelism strategy brings together a mix of business, marketing, sales and technology disciplines into a new approach to doing business.

Healthy API evangelism is centered around clear goals. Goals usually start with targets like new user registrations, but need to be set higher: around active API consumers, expanding how your existing users consume your API resources, all the way to a clear definition of how your API will extend and expand your brand.

Consumer Engagement
While it may seem obvious, actively engaging API consumers often gets lost in the shuffle. Having a strategic approach to reaching out to new users in a meaningful way, and establishing healthy practices for reaching out to existing developers at various stages of integration, is essential to growing an API initiative. Without planned engagement of API consumers, a canyon will grow between API provider and API consumer--one that may never be bridged.

An active blog with an RSS feed has the potential to be the face of an API and developer evangelism campaign. A blog will be the channel where you tell the stories that help consumers understand the value an API delivers and how other developers are integrating with it, ultimately leaving an SEO exhaust that will bring in new consumers. If comments are in place, a blog can also provide another channel for opening up conversation with API consumers and the public.

Without an understanding of the industry an API operates in, an API will not effectively serve any business sector. By establishing and maintaining a relevant keyword list, you can monitor competitors and companies that complement your platform, and establish an active understanding of the business sector you are trying to serve. Regular monitoring and analysis of the business landscape is necessary to tailor a meaningful API evangelism campaign.

When it comes to evangelism, support is one of the most critical elements. There is no better word of mouth for an API than an existing consumer talking about how good the API, and its support, is. Engage and support all API consumers. This will drive other vital parts of API evangelism, including creating positive stories for the blog, healthy conversations on social networks, and potentially creating evangelists within a community.

I recommend a lot of online services and tools for API providers and consumers to put to use, but there is not any single platform that delivers as much value to the API space as Github. I would put AWS as a close second, but Github provides a wealth of resources you can tap when providing APIs or building applications around them. Github is a critical piece of any API strategy, enabling social relationships with developers centered around code samples, libraries, or even documentation and resources for an API.

Social Networking
Twitter, Facebook, LinkedIn, Google+ and Github are essential to all API evangelism strategies. If an API does not have a presence on these platforms, it will miss out on a large segment of potential API consumers. Depending on the business sector an API is targeting, the preferred social network will vary. Providing an active, engaging social support presence when operating an API is vital to any API ecosystem. 

Social Bookmarking
Discovery and curation of bookmarks to relevant news and information via social bookmarking platforms is essential to an active API evangelism strategy. Using Reddit, Hacker News, and StumbleUpon will provide discovery of and access to a wealth of resources for understanding the API space, while also providing an excellent channel for broadcasting blog posts, news, and other resources about API operations--keeping consumers informed while opening up other opportunities for discovery.

API providers and API consumers are constantly building trust and establishing a long-term relationship with each other. One key facet of this trust, and the foundation of the relationship, is sharing a common road-map. API providers need to actively involve API consumers in where API resources are going, so that consumers can prepare, adjust, and even give feedback that may, or may not, influence the road-map. Nothing will piss off API consumers faster than keeping them in the dark about what is coming down the pipes, and surprising them with changes or breaks in their applications.

A healthy online presence is critical to any successful API strategy, but giving attention to a strong in-person presence at events is also a proven tactic of successful API providers. Evangelism involves a coordinated presence at relevant conferences, hackathons and local meetups. Events are necessary for building personal relationships with partners and API consumers that can be re-enforced online.

Measuring every aspect of API operations is necessary to understand what is happening in any API initiative. Reporting on every aspect of API operations is how you visualize and make sense of often very fast moving API activity. It is important to quantify API operations, and develop reports that are crafted to inform key stakeholders about an API initiative.
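As a trivial sketch of what quantifying operations can mean in practice--the log records and field names here are entirely hypothetical--even a few lines can turn raw request logs into per-consumer numbers worth putting in a report:

```python
from collections import Counter

# Hypothetical request-log records: (consumer_key, endpoint, status_code)
log = [
    ("key-a", "/drug/event.json", 200),
    ("key-a", "/drug/event.json", 200),
    ("key-b", "/drug/label.json", 404),
]

# Calls per consumer, and the share of requests that errored
calls_per_consumer = Counter(key for key, _, _ in log)
error_rate = sum(1 for _, _, status in log if status >= 400) / len(log)
```

Metrics this simple already answer the questions stakeholders ask first: who is actually using the API, and how often it is failing them.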

External facing activities will dominate any active API operations. However, an essential aspect of sustainable API programs is internal evangelism. Making sure co-workers across all departments are aware of and intimate with API operations, while also informing management, leadership, and budget decision makers, is critical to keeping API doors open, healthy, and active.

API and developer evangelism is an iterative cycle. Successful API operations will measure, assess, and plan the road-map in an ongoing fashion, often repeating on a weekly and monthly basis to keep cycles small, reducing the potential for friction in operations and minimizing failures when they happen.

A healthy API evangelism strategy will be something that is owned, at least partially, by all departments in a company. IT was a silo; APIs are about interoperability, internally and externally.

See The Full Blog Post

Gather Feedback For Your API With UserVoice

I’m looking at new and innovative ways companies are building analytics and visualizations on top of APIs, and one of the new tools I’ve come across is ImpactStory. ImpactStory aggregates altmetrics: diverse impacts from your articles, datasets, blog posts, and more. But this post isn’t about ImpactStory itself—I’ll crunch what they do and write about it in another post.

This post is about their usage of UserVoice, a feedback, helpdesk, and knowledge base management tool, and a service I always recommend to API owners looking to support different aspects of their API community.

ImpactStory simply has a “BETA - Send us Your Feedback” image in the top right corner of their site. When you click on the logo, you are presented with a simple UserVoice form for submitting your ideas of where you think ImpactStory should take their platform.

I think this is a pretty dead simple way of soliciting feedback from your community. Involving your users in your roadmap planning can go a long way in building goodwill with them, and encouraging participation and innovation in other ways.

After I spend more time playing with ImpactStory, I’ll do another post on what else they are up to, but at first glance it’s a pretty interesting approach to developing tracking analytics, visualizations, and other embeddable goodness using APIs.

See The Full Blog Post