
Query Splunk, the easy way, with plain old JavaScript

Sometimes you just need the basics. This post walks you through the simplest, quickest way to query data from Splunk, using plain old JavaScript. There are no third-party plug-ins or SDKs required, and no opinionated frameworks to deal with.

You will need:
  • A Splunk instance (get yours here if you don't have one)
  • An authorisation token
  • Node.js and npm installed
If you don't have an auth token, request one from your administrator. If you are an admin, just create a token using the following cURL command in Terminal (replace <HOST> with your Splunk host). Change the +300d if you want to adjust how long the token lives before it expires.

curl -k -u UID:PWD -X POST https://<HOST>:8089/services/authorization/tokens?output_mode=json --data name=admin --data audience=Managers --data-urlencode expires_on=+300d 

For example, if your user ID was Susan, your password was Wibble! and your Splunk instance was running on acme.com, then you would enter:

curl -k -u Susan:Wibble! -X POST https://acme.com:8089/services/authorization/tokens?output_mode=json --data name=admin --data audience=Managers --data-urlencode expires_on=+300d 

Copy your token and keep it safe. There's no way to retrieve it later.
You can test that your token works with the following cURL command:

curl -k -X POST -s -H "Authorization: Bearer <PUT TOKEN HERE>" https://<HOST>:8089/services/search/jobs -d search="search index=_internal" -d output_mode=json -d exec_mode=oneshot | json_pp

Enough cURL - now we're ready to rock!

Disclaimer: In the interests of being quick and dirty, we are going to hard-code the host, access token and search string in the app. I feel so dirty, but I'll sacrifice a few principles for your benefit...
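If you'd rather not hard-code the connection details, a slightly less dirty option is to read them from environment variables. This is just a sketch: SPLUNK_HOST and SPLUNK_TOKEN are names I've invented for this example, not anything Splunk requires.

// A minimal sketch: read the connection details from environment variables.
// Hypothetical variable names - set them before running, e.g.
//   export SPLUNK_HOST=acme.com
//   export SPLUNK_TOKEN=<your token>
const host = process.env.SPLUNK_HOST;
const token = `Bearer ${process.env.SPLUNK_TOKEN}`;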

Step 1: Create your Node.js app from the command line. For example, if your app is called 'simple', create a new directory called simple and run 'npm init' inside it (the exact commands are shown below).
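Something like this should do it (the -y flag simply accepts npm's defaults; drop it if you'd rather answer the prompts yourself):

mkdir simple
cd simple
npm init -y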


Step 2: Create a file called main.js and in your favourite editor enter the following: 

// Using core libraries of node.js (no 3rd party npm modules to install)
const https = require('https');
const querystring = require('querystring');

// obvious - but use a Splunk search you've already run in the Splunk app
let mySearchString = 'search index=_internal'; 
// use the token you created in the previous step
const token = 'Bearer <TOKEN>';
// set the response format to JSON (Splunk sends XML by default)
// use the 'oneshot' exec mode to run the search in a single request.
// Normally queries run asynchronously: the request returns a
// Search ID (SID) that you then use to get the results of the search.
let postData = querystring.stringify({
    'search': mySearchString,
    'output_mode':'json',
    'exec_mode':'oneshot'
});
// Set up the HTTP request
// fill out the HOST and token field
let options = {
    hostname: '<HOST>',
    port: 8089,
    path: '/services/search/jobs',
    method: 'POST',
    // accept Splunk's self-signed certificate (the equivalent of curl -k)
    rejectUnauthorized: false,
    requestCert: true,
    agent: false,
    headers: {
        // the POST body is form-encoded by querystring.stringify above
        // and sent with req.write() below
        'Content-Type': 'application/x-www-form-urlencoded',
        'X-Requested-By': 'STANDALONE',
        'Content-Length': Buffer.byteLength(postData),
        'Authorization': token
    }
};

// Print out what we have set up
console.dir(options);

// return an instance of the http.ClientRequest class
const req = https.request(options, (res) => {
    // display the response
    console.log(`STATUS: ${res.statusCode}`);
    console.log(`HEADERS: ${JSON.stringify(res.headers)}`);
    res.setEncoding('utf8');
    res.on('data', (chunk) => {
      console.log(`BODY: ${chunk}`);
    });
    res.on('end', () => {
      console.log('All done. No more data in response.');
    });
});

// Handle any errors
req.on('error', (e) => {
    console.error(`problem with request: ${e.message}`);
});

// Write data to the request body
req.write(postData);

// signify the end of the request
req.end();

Step 3: Save the file and then from the command line enter 'node main.js' to run the application. If all goes well, you should get a JSON response with your search results; the sketch below shows one way to parse it.
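The handler in main.js just prints each chunk of the response as it arrives. If you want to work with the results programmatically, a common pattern is to collect the chunks and parse the whole body once the response ends. Here's a sketch you could drop in in place of the https.request(...) call above; with output_mode=json, a oneshot search returns a JSON object whose 'results' field holds the matching events (at least on the instances I've tested).

// Alternative response handler: collect the body, then parse it as JSON
const req = https.request(options, (res) => {
    let body = '';
    res.setEncoding('utf8');
    res.on('data', (chunk) => { body += chunk; });
    res.on('end', () => {
        // note: this assumes the response really is JSON; a failed search
        // or bad token may return something else and make JSON.parse throw
        const parsed = JSON.parse(body);
        // in a oneshot JSON response the events live in the 'results' array
        console.log(`Got ${parsed.results.length} results`);
        console.dir(parsed.results[0]);
    });
});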




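Bonus: the comments in main.js mention the normal asynchronous flow, where the POST to /services/search/jobs returns a Search ID (SID) and you fetch the results separately. Below is a rough sketch of that flow using only Node's built-in modules. Treat it as a starting point rather than a finished implementation; the field names (sid, dispatchState, results) are what I've seen the Splunk REST API return with output_mode=json on my instance.

// Sketch of the asynchronous search flow: submit, poll, then fetch results.
// Replace <HOST> and <TOKEN> as before.
const https = require('https');
const querystring = require('querystring');

const host = '<HOST>';
const token = 'Bearer <TOKEN>';

// Small helper that wraps https.request in a Promise and resolves with the body
function call(method, path, data) {
    return new Promise((resolve, reject) => {
        const body = data ? querystring.stringify(data) : '';
        const req = https.request({
            hostname: host,
            port: 8089,
            path: path,
            method: method,
            rejectUnauthorized: false, // self-signed certificate, like curl -k
            headers: {
                'Content-Type': 'application/x-www-form-urlencoded',
                'Content-Length': Buffer.byteLength(body),
                'Authorization': token
            }
        }, (res) => {
            let out = '';
            res.setEncoding('utf8');
            res.on('data', (chunk) => { out += chunk; });
            res.on('end', () => resolve(out));
        });
        req.on('error', reject);
        if (body) req.write(body);
        req.end();
    });
}

(async () => {
    // 1. Submit the search job; the JSON response carries the Search ID (SID)
    const submit = JSON.parse(await call('POST', '/services/search/jobs', {
        search: 'search index=_internal',
        output_mode: 'json'
    }));
    const sid = submit.sid;
    console.log(`Search ID: ${sid}`);

    // 2. Poll the job status until it reports DONE
    // (a real implementation would also bail out if the job reports FAILED)
    let state = '';
    while (state !== 'DONE') {
        await new Promise((r) => setTimeout(r, 1000)); // wait a second between polls
        const status = JSON.parse(await call('GET',
            `/services/search/jobs/${sid}?output_mode=json`));
        state = status.entry[0].content.dispatchState;
        console.log(`Job state: ${state}`);
    }

    // 3. Fetch the results
    const response = JSON.parse(await call('GET',
        `/services/search/jobs/${sid}/results?output_mode=json`));
    console.log(`Got ${response.results.length} results`);
})();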