DataFire is an open source framework for building and integrating APIs. It provides over 600 integrations, including:
AWS • Azure • MongoDB • Slack • GitHub • Twilio • Trello • Square • Google Sheets • Gmail • Heroku
Each integration provides a set of composable actions. New actions can be built by combining existing actions, JavaScript, and external libraries. They are driven by JavaScript Promises, and can be triggered by a URL, on a schedule, or manually.
Want more? DataFire.io provides a simple interface for building, managing, and hosting DataFire projects.
| Sample project | Code | Hosted |
|---|---|---|
| Create an API backed by Google Sheets | Repo | Run on DataFire.io |
| E-mail yourself news headlines | Repo | Run on DataFire.io |
| Backend for a "Contact Us" form | Repo | Run on DataFire.io |
| Sync GitHub issues to a Trello board | Repo | Run on DataFire.io |
| Create a Spotify playlist from r/listentothis | Repo | Run on DataFire.io |
Be sure to install DataFire both globally and as a project dependency.
npm install -g datafire
npm install --save datafire
See the full example to learn about input validation, custom HTTP responses, and more.
Let's set up a simple DataFire project that has a single URL, GET /hello.
We'll need two things: an action, and a trigger.
First we create a new action - the logic that will be run when the URL is loaded:
let datafire = require('datafire');

module.exports = new datafire.Action({
  handler: input => "Hello, world!",
});
Next we set up a path trigger in DataFire.yml:
paths:
  /hello:
    get:
      action: ./hello.js
Now we can run:
datafire serve --port 3000 &
# DataFire listening on port 3000
curl http://localhost:3000/hello
# "Hello, world!"
kill $! # Stop the server
Your actions, triggers, credentials, and configuration options are set in DataFire.yml. Here's a sample DataFire.yml
Over 600 integrations are available on npm, under the @datafire scope. For example, to install the hacker_news integration:
npm install @datafire/hacker_news
Each integration comes with a set of actions. For example, the hacker_news integration contains the getStories, getItem, and getUser actions.
Check out the usage and authentication documentation to learn more.
Actions come in two varieties:
- actions you build yourself in JavaScript, e.g. ./actions/hello.js
- actions that are part of an integration, e.g. hacker_news/getUser
You can run actions on the command line:
datafire run hacker_news/getUser -i.username norvig
Or create triggers for them:
paths:
  /hn/profile:
    get:
      action: hacker_news/getUser
      input:
        username: 'norvig'
Or run them in JavaScript:
var hackerNews = require('@datafire/hacker_news').create();

// Using await (requires NodeJS >= v7.10):
(async function() {
  var user = await hackerNews.getUser({username: 'norvig'});
  console.log(user);
})();

// Or with Promises:
hackerNews.getUser({
  username: 'norvig',
}).then(user => {
  console.log(user);
});
Every action has a handler, which must return a value or a Promise. Actions can also specify their inputs and outputs using JSON Schema. Input (but not output) will be validated each time the action is run.
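To make the validation step concrete, here's an illustrative sketch: a hypothetical action object with a JSON-schema-style input, plus a tiny stand-in `run` helper so the example executes without the datafire package (the helper mimics the idea of input validation; it is not DataFire's API):

```javascript
// Hypothetical action shape, mirroring the Action examples above.
const action = {
  inputs: [{ title: 'name', type: 'string', maxLength: 20 }],
  handler: input => `Hello, ${input.name}!`,
};

// Stand-in validator/runner (illustration only, not DataFire internals):
// checks each declared input, then resolves the handler's return value.
function run(action, input) {
  for (const schema of action.inputs) {
    const value = input[schema.title];
    if (typeof value !== schema.type) {
      throw new Error(`${schema.title}: expected ${schema.type}`);
    }
    if (schema.maxLength && value.length > schema.maxLength) {
      throw new Error(`${schema.title}: exceeds maxLength`);
    }
  }
  return Promise.resolve(action.handler(input));
}

run(action, { name: 'world' }).then(msg => console.log(msg)); // prints "Hello, world!"
```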
In DataFire, actions are run by triggers. There are three types of triggers:
- paths - URLs like GET /hello or POST /pets/{id}
- tasks - jobs that run on a schedule, like "every hour" or "every Tuesday at 3pm"
- tests - jobs that can be run manually using the datafire command-line tool

Each trigger must have an action, and can also specify the input and accounts to pass to that action.
Paths create URLs that trigger your actions. For example, you can create a URL that returns your GitHub profile:
paths:
  /github_profile:
    get:
      action: github/users.username.get
      input:
        username: 'torvalds'
If you don't specify the input field, DataFire will automatically pass either query parameters (for GET/DELETE/HEAD/OPTIONS) or the JSON body (for POST/PATCH/PUT) from the request to the action.
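For example, reusing the hacker_news/getUser action from above, omitting the input field lets the caller supply the input via the query string, e.g. GET /hn/profile?username=norvig:

```yaml
# No `input` field: for a GET request, query parameters become the
# action's input, so /hn/profile?username=norvig passes
# {username: "norvig"} to hacker_news/getUser.
paths:
  /hn/profile:
    get:
      action: hacker_news/getUser
```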
Start serving your paths with:
datafire serve --port 3000
You can schedule tasks in DataFire.yml by specifying a rate or cron expression.
tasks:
  send_database_report:
    action: ./send-db-report.js
    schedule: rate(1 day)  # or cron(0 0 * * * *)
    accounts:
      google_gmail: lucy
      mongodb: mongo_read_only
A monitor will poll a particular resource for new items, and only run your action if a new item is found. For instance, we can check for new items on Reddit every 5 minutes:
tasks:
  watch_reddit:
    schedule: rate(5 minutes)
    monitor:
      action: reddit_rss/frontPage
      array: feed.entries
      trackBy: link
      input:
        subreddit: sports
    action: ./post-story-to-slack.js
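Conceptually, trackBy de-duplication amounts to remembering which keys have already been seen and acting only on unseen ones. A minimal sketch of that idea (illustrative JavaScript, not DataFire internals):

```javascript
// Keys (here, each entry's `link`) seen on previous polls.
const seen = new Set();

// Return only the items whose trackBy key has not been seen before,
// and remember them for subsequent polls.
function newItems(items, trackBy) {
  const fresh = items.filter(item => !seen.has(item[trackBy]));
  fresh.forEach(item => seen.add(item[trackBy]));
  return fresh;
}

// First poll: both entries are new, so the action would run for both.
console.log(newItems([{link: 'a'}, {link: 'b'}], 'link').length); // 2
// Second poll: only the unseen entry triggers the action.
console.log(newItems([{link: 'a'}, {link: 'c'}], 'link').length); // 1
```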
Start running tasks with:
datafire serve --tasks
Tests allow you to save a particular set of inputs and accounts for a given action, so that the action can be run manually with the DataFire command-line tool.
tests:
  get_torvalds:
    action: github/users.username.get
    input:
      username: torvalds
  get_norvig:
    action: github/users.username.get
    input:
      username: norvig
Run a test with:
datafire test <test_id>
Check out the cookbook for common patterns, including paginated responses and mocking/testing.
Run datafire --help or datafire <command> --help for more info.
datafire serve --port 3000 # Start API server
datafire serve --tasks # Start API server and start running tasks
datafire list # View installed integrations
datafire list -a # View all available integrations
datafire list -a -q news # Search for integrations by keyword
datafire integrate --name petstore --openapi http://petstore.swagger.io/v2/swagger.json
datafire integrate --name reddit --rss http://www.reddit.com/.rss
datafire describe hacker_news # Show info and actions
datafire describe hacker_news/getItem # Show action details
datafire authenticate google_gmail # Store credentials in DataFire-auth.yml
# Run an action
datafire run ./sendMessage.js
# Run integration actions with [integration]/[action]
datafire run github/repositories.get
# Pass parameters with --input
datafire run github/search.repositories.get --input.q java
# Use credentials with --accounts
datafire run github/user.get --accounts.github.access_token "abcde"
Contributions are welcome!
git clone https://github.com/DataFire/DataFire && cd DataFire
npm install
Tests are run with npm test and require ports 3333-3336 to be open.