Threat Hunting cookbook

Copy-paste recipes to hunt with TweetFeed IOCs in your terminal, SIEM and firewall





1. Quick start - get raw IOCs

Four ways to ingest the feed. Pick the one that matches your downstream tool.

All IOCs from today as JSON:

curl -s https://api.tweetfeed.live/v1/today | jq .

Filter by IOC type (url, domain, ip, sha256, md5):

curl -s https://api.tweetfeed.live/v1/today/url

Combine type and tag (this week's CobaltStrike IPs):

curl -s https://api.tweetfeed.live/v1/week/ip/CobaltStrike

Response shape (one IOC per array element):

{
  "date": "2026-05-03 06:07:00",
  "user": "PhishStats",
  "type": "domain",
  "value": "evil.example.shop",
  "tags": ["#phishing"],
  "tweet": "https://x.com/PhishStats/status/..."
}
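To sanity-check your parsing before wiring it into tooling, the same jq pattern can run against an inline sample of that shape (the values below are illustrative, not real IOCs):

```shell
# One-element sample matching the documented response shape.
sample='[{"date":"2026-05-03 06:07:00","user":"PhishStats","type":"domain","value":"evil.example.shop","tags":["#phishing"],"tweet":"https://x.com/PhishStats/status/123"}]'

# Emit tab-separated type/value/tags - swap the sample for the live API output.
echo "$sample" | jq -r '.[] | [.type, .value, (.tags|join(","))] | @tsv'
# Output (tab-separated): domain  evil.example.shop  #phishing
```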

CSVs cover four windows: today, week, month, year. Header: date,user,type,value,tags,tweet.

wget https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/today.csv
wget https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/week.csv
wget https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/month.csv
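A common next step is pulling the value column out of the CSV. The awk one-liner below assumes the fields before value never contain embedded commas (true for the documented header, since date, user and type are comma-free); the sample rows are illustrative:

```shell
# Two illustrative rows under the documented header.
cat > today.csv <<'EOF'
date,user,type,value,tags,tweet
2026-05-03 06:07:00,PhishStats,domain,evil.example.shop,#phishing,https://x.com/PhishStats/status/123
2026-05-03 06:08:00,PhishStats,ip,203.0.113.9,#phishing,https://x.com/PhishStats/status/124
EOF

# 'value' is the 4th comma-separated column; NR > 1 skips the header row.
awk -F',' 'NR > 1 { print $4 }' today.csv | sort -u
# -> 203.0.113.9
#    evil.example.shop
```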

RSS feed with the latest IOCs (refreshed every 15 min):

curl -s https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/rss.xml
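For a quick look at the item links without an XML parser, a grep/sed pass works (fragile by design - use a real RSS parser in production). The inline sample below stands in for the live rss.xml; pipe the curl output above through the same filter:

```shell
# Minimal stand-in for the feed structure.
rss='<rss><channel><item><link>https://x.com/user/status/1</link></item></channel></rss>'

# Extract the text of every <link> element.
echo "$rss" | grep -o '<link>[^<]*</link>' | sed -e 's/<link>//' -e 's|</link>||'
# -> https://x.com/user/status/1
```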

TweetFeed publishes a MISP-format manifest with 4 Events (today / week / month / year). In MISP, go to Sync Actions → Feeds → Add Feed:

URL:           https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/misp/manifest.json
Source format: misp
Provider:      TweetFeed
Auto-pull:     yes
Default tags:  TweetFeed, tlp:clear, type:OSINT

Save, click Fetch and store all feed data to seed, then enable Cron Jobs → Feed Pull for auto-refresh.


2. Terminal & shell

Four recipes you can paste into a shell right now. All return clean stdout suitable for piping into your tooling.

2.1 Blocklist refresh - cron-friendly, runs every 15 min and updates a flat file (keep the crontab entry on a single line; crontab format does not support backslash continuations):

*/15 * * * * curl -sfL https://api.tweetfeed.live/v1/today/url | jq -r '.[].value' | sort -u > /etc/blocklists/tweetfeed-urls.txt
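One caveat with that recipe: the shell truncates the blocklist the moment the redirect opens, so a failed API call leaves you with an empty file. A temp-file-then-rename variant (a sketch; the URL and destination path match the cron line above) keeps the last good copy:

```shell
#!/bin/sh
set -e
tmp=$(mktemp)
curl -sfL https://api.tweetfeed.live/v1/today/url \
  | jq -r '.[].value' | sort -u > "$tmp"
# Only replace the live list if the refresh produced data;
# mv within one filesystem is atomic, so readers never see a partial file.
[ -s "$tmp" ] && mv "$tmp" /etc/blocklists/tweetfeed-urls.txt
```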

2.2 Slice by malware family tag - only IPs tagged Cobalt Strike from this week:

curl -sL https://api.tweetfeed.live/v1/week/ip/CobaltStrike | jq -r '.[].value'

2.3 Pivot from a researcher - every IOC published this week by @malwrhunterteam (filter is client-side because the REST endpoint does not accept user as a path filter):

curl -sL https://api.tweetfeed.live/v1/week \
  | jq -r '.[] | select(.user=="malwrhunterteam") | "\(.type)\t\(.value)\t\(.tags|join(","))"'

2.4 IOC to source tweet - every IOC carries provenance, useful before you alert on it:

curl -sL https://api.tweetfeed.live/v1/today \
  | jq -r '.[] | select(.value=="evil.example.com") | .tweet'

3. SIEM playbooks

Use TweetFeed as a lookup table or external source against your log indexes.

Setup (Splunk) - SPL cannot pull a remote URL natively, so refresh the lookup file with a host-side cron job that drops the TweetFeed CSV into your app's lookups directory every 15 min (the path below assumes a default install under /opt/splunk and the search app; adjust for your deployment):

*/15 * * * * curl -sfL https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/today.csv -o /opt/splunk/etc/apps/search/lookups/tweetfeed_today.csv

3.1 Match outbound proxy logs against TweetFeed URLs and domains, then enrich the hits with the lookup's tags and tweet fields (the subsearch only passes url, so the enrichment needs an explicit lookup):

index=proxy [
  | inputlookup tweetfeed_today.csv
  | where type IN ("url","domain")
  | rename value AS url
  | fields url
]
| lookup tweetfeed_today.csv value AS url OUTPUT tags, tweet
| stats count, values(tags) AS tags, values(tweet) AS tweet by src_ip, url

3.2 DNS query enrichment - any DNS query that hits a TweetFeed domain in the last 7d:

index=dns sourcetype=dns:query [
  | inputlookup tweetfeed_week.csv
  | where type="domain" | rename value AS query | fields query
]
| lookup tweetfeed_week.csv value AS query OUTPUT tags, tweet
| table _time, src_ip, query, tags, tweet

3.3 Real-time alert - save the previous query as a Notable Event with severity=medium and trigger immediately when count > 0.

3.4 Microsoft Sentinel (KQL) - externaldata against today's CSV, DNS hits in the last 24h:

let TweetFeedDomains = externaldata(date:string, user:string, type:string, value:string, tags:string, tweet:string)
  [@"https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/today.csv"]
  with (format="csv", ignoreFirstRecord=true)
  | where type == "domain"
  | project Domain = value, Tags = tags, Tweet = tweet;
DnsEvents
| where TimeGenerated > ago(24h)
| where Name in (TweetFeedDomains | project Domain)
| join kind=leftouter TweetFeedDomains on $left.Name == $right.Domain
| project TimeGenerated, Computer, ClientIP, Name, Tags, Tweet

3.5 Microsoft Sentinel (KQL) - SecurityEvent against TweetFeed IPs, logon source pivot against week.csv:

let TweetFeedIPs = externaldata(date:string, user:string, type:string, value:string, tags:string, tweet:string)
  [@"https://raw.githubusercontent.com/0xDanielLopez/TweetFeed/master/week.csv"]
  with (format="csv", ignoreFirstRecord=true)
  | where type == "ip"
  | project IPAddress = value, Tags = tags, Tweet = tweet;
SecurityEvent
| where TimeGenerated > ago(7d)
| where IpAddress in (TweetFeedIPs | project IPAddress)
| join kind=leftouter TweetFeedIPs on $left.IpAddress == $right.IPAddress
| project TimeGenerated, Computer, Account, IpAddress, Tags, Tweet

The same externaldata pattern carries over to Defender for Endpoint advanced hunting (KQL) and works for every IOC type (url, domain, ip, sha256, md5). Swap week.csv for today.csv or month.csv depending on your freshness vs coverage trade-off.


4. Detection content

Generate a Sigma rule template that any modern SIEM can compile to its query language.

4.1 Sigma rule template - DNS lookup to a TweetFeed-listed domain. Populate %TweetFeedDomains% via your Sigma processing pipeline (e.g. pySigma, whose processing pipelines resolve the expand modifier from a placeholder value list):

title: TweetFeed-listed Malicious Domain Resolution
id: f1c0c0d0-1111-4444-9999-aaaaaaaaaaaa
status: experimental
description: Detects DNS lookups to domains published by TweetFeed within the last week
references:
  - https://tweetfeed.live/
  - https://api.tweetfeed.live/v1/week/domain
author: TweetFeed
date: 2026-05-03
logsource:
  product: windows
  service: sysmon
detection:
  selection:
    EventID: 22
    QueryName|expand: '%TweetFeedDomains%'
  condition: selection
falsepositives:
  - Allowlisted internal infrastructure
level: medium
tags:
  - attack.command_and_control
  - attack.t1071.004
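The placeholder itself resolves against a plain newline-separated domain list. One way to generate it (the file name tweetfeed_domains.txt is an assumption; it must match whatever your pipeline config references):

```shell
# Build the value list that the '|expand' placeholder resolves against:
# fetch this week's domains, keep one per line, dedupe.
curl -sfL https://api.tweetfeed.live/v1/week/domain \
  | jq -r '.[].value' | sort -u > tweetfeed_domains.txt
```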

Frequently asked questions

How fresh are the IOCs?

TweetFeed scrapes the source feeds every 15 minutes via cron. The end-to-end pipeline (scrape, dedupe, classify, aggregate, push) takes roughly 17 to 25 seconds, so an IOC tweeted right now lands in today.csv and the API within one cron tick.

Is there a rate limit on the API?

api.tweetfeed.live runs as a Cloudflare Worker on the Free plan (100,000 requests per day). Real traffic sits around 7 to 8K requests per day, leaving roughly 13x headroom. There is no authentication or per-key throttling - please be reasonable.

Can I use TweetFeed data commercially?

Yes. The IOC data is published under CC0 1.0 (no rights reserved). Attribution is appreciated but not required. The website code, branding and logos are not included in CC0.

How do I credit the original researcher?

Every IOC carries a user field (Twitter/X handle) and a tweet field (full URL to the source post). Link back to the tweet whenever you republish or alert on an IOC.

How do I handle false positives?

TweetFeed publishes everything that is reasonably tagged and does not suppress noise. Use a client-side allowlist (the KQL and SPL recipes show the pattern) and report false positives via the public Featurebase board.

Is there an MCP server for AI agents?

Yes. mcp.tweetfeed.live exposes 5 tools: query_iocs, check_url, check_ip, check_hash and list_recent_iocs. See the Agents page for the config snippet.

How do I contribute new IOCs?

Post the IOC on Twitter/X with one of the tracked hashtags (full list on the Feeds page) or get added to the curated TweetFeedList. The next cron tick picks it up automatically.