The prompt is almost wrapped, my fellow YOLOers!

It’s 4:20 am, I’m running on the last fumes of Monster, and my fingertips are ground beef from all this FINGER BLASTING!

See you tomorrow with the final touches!

Just need to build out the tables, scrape the data, and test before Monday…

WHO’S READY FOR TENDIE TOWN!!!!???

Build a Stock Option Analysis and Trade Picker Prompt:

Step 1: Understand what data to collect.

Create a List of Data Needed

**Fundamental Data:** to identify undervalued growth stocks or overhyped ones.
Data Points: Earnings Per Share, Revenue, Net Income, EBITDA, P/E Ratio, PEG Ratio, Price/Sales Ratio, Forward Guidance, Gross and Operating Margins, Free Cash Flow Yield, Insider Transactions

**Options Chain Data:** to identify how expensive options are.
Data Points: Implied Volatility, IV Rank, IV Percentile, Delta, Gamma, Theta, Vega, Rho, Open Interest by strike/expiration, Volume by strike/expiration, Skew / Term Structure

**Price & Volume Histories:** to blend fundamentals with technicals and time entries.
Data Points: Daily OHLCV (Open, High, Low, Close, Volume), Intraday (1m/5m), Historical Volatility, Moving Averages (50/100/200-day), ATR (Average True Range), RSI (Relative Strength Index), MACD (Moving Average Convergence Divergence), Bollinger Bands, Volume-Weighted Average Price (VWAP), Pivot Points, Price Momentum Metrics

**Alternative Data:** to predict earnings surprises, demand shifts, and sentiment spikes.
Data Points: Social Sentiment (Twitter/X, Reddit), Web-Scraped Reviews (Amazon, Yelp), Credit Card Spending Trends, Geolocation Foot Traffic (Placer.ai), Satellite Imagery (parking lots), App Download Trends (Sensor Tower), Job Postings (Indeed, LinkedIn), Product Pricing Scrapes, News Event Detection (Bloomberg, Reuters, NYT, WSJ), Google Trends Search Interest

**Macro Indicators:** shape market risk appetite, rates, and sector rotations.
Data Points: CPI (Inflation), GDP Growth Rate, Unemployment Rate, FOMC Minutes/Decisions, 10-Year Treasury Yields, VIX (Volatility Index), ISM Manufacturing Index, Consumer Confidence Index, Nonfarm Payrolls, Retail Sales Reports, Sector-Specific Vol Indices

**ETF & Fund Flows:** can cause mechanical buying or selling pressure.
Data Points: SPY and QQQ Flows, Sector ETF Inflows/Outflows (XLK, XLF, XLE), ARK Fund Holdings and Trades, Hedge Fund 13F Filings, Mutual Fund Flows, ETF Short Interest, Leveraged ETF Rebalancing Flows, Index Reconstruction Announcements, Passive vs. Active Share Trends, Large Redemption Notices

**Analyst Ratings & Revisions:** positive revisions are linked to alpha generation.
Data Points: Consensus Target Price, Recent Upgrades/Downgrades, Earnings Estimate Revisions, Revenue Estimate Revisions, Margin Estimate Changes, New Coverage Initiations, Short Interest Updates, Institutional Ownership Changes, Sell-Side Model Revisions, Recommendation Dispersion

Step 2: Collect, Store and Clean the Data.

Create your Database

## Install Homebrew
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

## When prompted (you may be asked twice), enter the password you use to log into your laptop

## Add Homebrew to your PATH (enter each line individually)
echo >> /Users/alexanderstuart/.zprofile
echo 'eval "$(/opt/homebrew/bin/brew shellenv)"' >> /Users/alexanderstuart/.zprofile
eval "$(/opt/homebrew/bin/brew shellenv)"

## Test that Homebrew works
brew --version

## Install PostgreSQL 14 (matches the service name used below)
brew install postgresql@14

## Start PostgreSQL as a background service
brew services start postgresql@14

## Confirm PostgreSQL is running
pg_ctl -D /opt/homebrew/var/postgresql@14 status

## Create your database
createdb trading_data

## Connect to your database
psql trading_data

Create the Data Tables

  • Create Fundamental Data Table
  • Create Options Chain Data Table
  • Create Price & Volume Histories Table (see the schema sketch after this list)
  • Create Alternative Data Table
  • Create Macro Indicator Data Table
  • Create ETF & Fund Flows Data Table
  • Create Analyst Rating & Revision Data Table
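
As a minimal sketch, here is what one of these tables could look like in psql, using the Price & Volume Histories table as the example. The column names and types are my assumptions, not a canonical schema:

-- Hypothetical schema for the Price & Volume Histories table.
-- Column names and types are assumptions; extend with the other
-- data points (ATR, RSI, etc.) or derive those in Step 3 instead.
CREATE TABLE price_volume_histories (
    ticker      TEXT           NOT NULL,
    trade_date  DATE           NOT NULL,
    open        NUMERIC(12,4),
    high        NUMERIC(12,4),
    low         NUMERIC(12,4),
    close       NUMERIC(12,4),
    volume      BIGINT,
    PRIMARY KEY (ticker, trade_date)
);

The other tables follow the same pattern: one row per ticker (or indicator) per date, with typed numeric columns for each data point.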

Import Data into the Data Tables

  • Import Fundamental Data
  • Import Options Chain Data
  • Import Price & Volume Histories (see the \copy sketch after this list)
  • Import Alternative Data
  • Import Macro Indicator Data
  • Import ETF & Fund Flows Data
  • Import Analyst Rating & Revision Data
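
Each import can be a CSV load with psql's \copy. A sketch for the price table, assuming a prices.csv export whose header matches the hypothetical columns above (the file name and column order are assumptions about your scraped data):

-- Bulk-load a CSV into the hypothetical price/volume table.
-- 'prices.csv' and its column order are assumptions about your export.
\copy price_volume_histories (ticker, trade_date, open, high, low, close, volume) FROM 'prices.csv' WITH (FORMAT csv, HEADER true)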

Step 3: Transform and Merge Data

Transform Data Tables into the Derived Numeric Features

  • Transform Fundamental Data into Fundamentals Quarterly
  • Transform Options Chain Data into Options Spreads
  • Transform Price & Volume Histories into Daily Technicals (see the SQL sketch after this list)
  • Transform Alternative Data into Sentiment Scores
  • Transform Macro Indicator Data into Macro Snapshot
  • Transform ETF & Fund Flows Data into ETF Flows
  • Transform Analyst Rating & Revision Data into Raw Analyst Feed
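
As one concrete example of these transforms, a Daily Technicals table can be derived with window functions. This is a sketch against the hypothetical price_volume_histories table from Step 2; the 50-day SMA and 20-day momentum columns are illustrative, not the full feature set:

-- Derive a daily technicals table from raw price history.
-- sma_50 = 50-day simple moving average; momentum_20d = 20-day return.
CREATE TABLE daily_technicals AS
SELECT
    ticker,
    trade_date,
    close,
    AVG(close) OVER (
        PARTITION BY ticker
        ORDER BY trade_date
        ROWS BETWEEN 49 PRECEDING AND CURRENT ROW
    ) AS sma_50,
    close / NULLIF(LAG(close, 20) OVER (
        PARTITION BY ticker
        ORDER BY trade_date
    ), 0) - 1 AS momentum_20d
FROM price_volume_histories;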

Step 4: Write Prompt and Paste Data

System

You are ChatGPT, Head of Options Research at an elite quant fund. All heavy maths is pre-computed; you receive a JSON list named <payload>. Each record contains:

{
  "ticker": "AAPL",
  "sector": "Tech",
  "model_score": 0.87,   // higher = better edge
  "valuation_z": -0.45,  // neg = cheap
  "quality_z": 1.20,     // pos = high margins/ROE
  "momentum_z": 2.05,    // pos = strong up-trend
  "alt_sent_z": 1.80,    // pos = bullish chatter
  "flow_z": 1.10,        // pos = ETF money flowing in
  "quote_age_min": 4,    // minutes since quote
  "top_option": {
    "type": "bull_put_spread",
    "legs": ["190P", "185P"],
    "credit": 1.45,
    "max_loss": 3.55,
    "pop": 0.78,
    "delta_net": -0.11,
    "vega_net": -0.02,
    "expiry": "2025-08-15"
  }
}

Goal

Return exactly **5 trades** that, as a basket, maximise edge while keeping portfolio delta, vega and sector exposure within limits.

Hard Filters (discard any record that fails):

  • quote_age_min ≤ 10
  • top_option.pop ≥ 0.65
  • top_option.credit / top_option.max_loss ≥ 0.33
  • top_option.max_loss ≤ 0.5 % of assumed 100 k NAV (i.e. ≤ $500)

Selection Rules

1. Rank by model_score.
2. Enforce diversification: max 2 trades per GICS sector.
3. Keep net basket Delta in [-0.30, +0.30] × NAV / 100 k and net Vega ≥ -0.05 × NAV / 100 k. (Use the delta_net and vega_net in each record.)
4. If ties, prefer highest momentum_z and flow_z.

Output

Return a **JSON object** with:

{
  "ok_to_execute": true/false,  // false if fewer than 5 trades meet rules
  "timestamp_utc": "2025-07-27T19:45:00Z",
  "macro_flag": "high_vol" | "low_vol" | "neutral",  // pick from macro_snapshot
  "trades": [
    {
      "id": "T-1",
      "ticker": "AAPL",
      "strategy": "bull_put_spread",
      "legs": ["190P", "185P"],
      "credit": 1.45,
      "max_loss": 3.55,
      "pop": 0.78,
      "delta_net": -0.11,
      "vega_net": -0.02,
      "thesis": "Strong momentum + ETF inflows; spread sits 3 % below 50-DMA."
    },
    …(4 more)…
  ],
  "basket_greeks": {
    "net_delta": +0.12,
    "net_vega": -0.04
  },
  "risk_note": "Elevated VIX; if CPI print on Aug 1 surprises hot, basket may breach delta cap.",
  "disclaimer": "For educational purposes only. Not investment advice."
}

Style

  • Keep each thesis ≤ 30 words.
  • Use plain language – no hype.
  • Do not output anything beyond the specified JSON schema. If fewer than 5 trades pass all rules, set "ok_to_execute": false and leave "trades" empty.
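
If you want, the prompt's hard filters can also be pre-applied in SQL so the payload only ever carries survivors. A sketch assuming an options_spreads table whose columns mirror the payload fields (table and column names are assumptions, and I'm assuming max_loss is quoted per share, hence the ×100 contract multiplier):

-- Pre-screen candidates with the prompt's hard filters.
-- Table/column names are assumptions mirroring the payload fields.
SELECT *
FROM options_spreads
WHERE quote_age_min <= 10
  AND pop >= 0.65
  AND credit / NULLIF(max_loss, 0) >= 0.33
  AND max_loss * 100 <= 500;  -- <= 0.5% of an assumed 100k NAV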

Screenshots:

  • https://preview.redd.it/vi9xyosknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=ce0c57bc7b1e417ca21431cd335f86213327fb0f
  • https://preview.redd.it/qu8zqpsknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=b9f9d64d2e660cfdd9453bd54e30647a44beed9e
  • https://preview.redd.it/tlscpqsknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=88ded68956dbf9baf98ade88b2a7cecd89094a95
  • https://preview.redd.it/oo0sonsknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=9ad90d0d1c8c75015db5ae3c36f1574f60ec8fab
  • https://preview.redd.it/08uxmrsknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=5549f782de7e148d7df7fbf65db37e8e69a6458d
  • https://preview.redd.it/7tvk7psknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=c23131829e8872dadb000ec686cfbec540855a86
  • https://preview.redd.it/cpr19osknu9f1.jpg?width=2550&format=pjpg&auto=webp&s=26517e1015e7c2a4194e17e5d5aa8fc4ef0d3e15

Step 5: Feed the Data and Prompt into ChatGPT
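
Postgres can emit the <payload> JSON directly, so the paste into ChatGPT is a single file. A sketch assuming a candidates view that joins the Step 3 derived tables into payload-shaped rows (the view name is an assumption):

-- Serialize candidate rows into a JSON array for pasting into ChatGPT.
-- "candidates" is an assumed view over the Step 3 derived tables.
\copy (SELECT json_agg(row_to_json(c)) FROM candidates c) TO 'payload.json'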
