ai-goofish-monitor is an open-source Goofish product monitoring system from Usagi-org.
Its goal is clear: automate Goofish search, filtering, product analysis, result logging, and notifications, so users can find matching second-hand listings faster. The project uses Playwright for browser automation and connects to image-capable AI models to judge product information more intelligently.
Project link: https://github.com/Usagi-org/ai-goofish-monitor
The Short Version
ai-goofish-monitor is closer to a “Goofish purchasing intelligence dashboard” than a simple keyword alert script.
It has several notable traits:
- A complete web admin UI for managing tasks, accounts, AI criteria, logs, and results.
- Concurrent multi-task monitoring, where each task can define keywords, price range, filters, and AI Prompt.
- Playwright-based page automation, useful for scenarios that require login state and page interaction.
- AI-based product judgment, not just keyword matching.
- Notifications through ntfy.sh, WeCom, Bark, Telegram, Webhook, and other channels.
- Cron scheduling, multi-account management, proxy rotation, retry handling, and Docker deployment.
It fits users who often search Goofish for specific items, such as second-hand electronics, cameras, GPUs, hard drives, game consoles, musical instruments, appliances, and collectibles. But it is not an “automatic bargain hunter.” Goofish search results change, login states may expire, and platform risk controls can affect automation stability. Treat it as an assisted filtering tool, not a replacement for human judgment.
What Problem It Solves
Finding second-hand products on Goofish often has several pain points:
- There are too many listings to browse manually.
- Titles and descriptions are inconsistent, so keywords can miss or misclassify listings.
- Good prices appear briefly and may be gone by the time you see them.
- The same product may differ by region, price, condition, and seller.
- Low-priced listings can include accessories, damaged goods, refurbished items, or misleading titles.
- Watching multiple keywords continuously is hard to sustain manually.
Basic keyword alerts solve only part of this. Searching for “ThinkPad X1” may mix in accessories, broken screens, empty boxes, or disassembled parts. Searching for “Sony A7C” may return lens bundles, rentals, clickbait titles, or abnormal prices.
ai-goofish-monitor’s idea is to use automation to collect candidate listings, then let AI apply your criteria, and finally push the results worth attention.
Core Features
The project README lists a fairly complete feature set:
- Visual web management: task management, account management, AI criteria editing, run logs, and result browsing.
- AI-driven workflow: create tasks with natural language and analyze products with multimodal models.
- Concurrent tasks: each task can define keywords, prices, filters, and AI Prompt independently.
- Advanced filters: free shipping, newly listed time range, and province / city / district filtering.
- Instant notifications: ntfy.sh, WeCom, Bark, Telegram, Webhook, and more.
- Scheduled execution: Cron-based periodic tasks.
- Account and proxy rotation: multi-account management, task-account binding, proxy pool rotation, and retry handling.
- Docker deployment: containerized deployment support.
Together, these cover the full chain from creating a monitoring task to receiving a matching alert.
Workflow
A typical workflow looks like this:
1. Deploy the service and open the Web UI.
2. Import Goofish account login state.
3. Create a monitoring task.
4. Set keywords, price range, region, newly listed window, and other filters.
5. Write criteria or let AI generate them.
6. Run the task in real time or on a schedule.
7. Playwright opens pages and extracts product information.
8. AI checks title, description, images, and Prompt against your needs.
9. Matching results are written to SQLite.
10. The system sends notifications through configured channels.
11. You review results, logs, and price history in the Web UI.
AI matters most at step 8. It can understand natural language criteria like “good condition, reasonable price, no accessories, no repaired units, preferably local pickup,” which is more flexible than simple keyword rules.
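Under the hood, step 8 is essentially an OpenAI-compatible chat completion carrying both text and image parts. The sketch below is hypothetical, not the project's actual code: the function name, prompt wording, and listing field names ("title", "description", "image_urls") are all assumptions; only the payload shape follows the standard multimodal chat-completions format.

```python
# Hypothetical sketch of packing a listing plus user criteria into an
# OpenAI-compatible multimodal chat request. Field names and prompt
# wording are assumptions, not the project's real schema.
def build_analysis_request(model, criteria, listing):
    """Return a chat-completions payload carrying text and image parts."""
    user_content = [{
        "type": "text",
        "text": (
            f"Criteria: {criteria}\n"
            f"Title: {listing['title']}\n"
            f"Description: {listing['description']}\n"
            "Answer MATCH or NO_MATCH with a one-line reason."
        ),
    }]
    for url in listing.get("image_urls", []):
        # Image input uses the standard image_url content-part shape.
        user_content.append({"type": "image_url", "image_url": {"url": url}})
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
    }
```

Any OpenAI-compatible client can POST a payload like this to the configured endpoint; the important constraint is that the chosen model must accept image inputs.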
Docker Deployment
The project recommends Docker deployment. With the repository cloned and its docker-compose.yaml in place, the standard docker compose up -d command brings the stack up; the README lists the exact commands.
The default Web UI address is http://127.0.0.1:8000.
The README lists the official image name, along with an accelerated mirror example for environments where the pull is slow.
The Docker image includes Chromium, so the host does not need an extra browser. Default persistent directories include:
- data/: main SQLite storage for tasks, results, and price history.
- state/: login-state cookie files.
- prompts/: task prompts.
- logs/: runtime logs.
- images/: product images and temporary task image folders.
If you change SERVER_PORT in .env, also update the port mapping in docker-compose.yaml.
Minimum Configuration
The minimum configuration mainly covers the AI model and Web UI login, set through environment variables in .env.
The first three are required for AI model access:
- OPENAI_API_KEY: model API key.
- OPENAI_BASE_URL: OpenAI-compatible endpoint.
- OPENAI_MODEL_NAME: model name that supports image input.
WEB_USERNAME and WEB_PASSWORD are used for Web UI login. The README mentions the default credentials admin/admin123; change them in production.
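A quick startup check for these three variables catches misconfiguration before the first AI call fails. This helper is a small illustrative sketch, not part of the project:

```python
import os

# The three variables the README marks as required for AI model access.
REQUIRED_AI_SETTINGS = ["OPENAI_API_KEY", "OPENAI_BASE_URL", "OPENAI_MODEL_NAME"]

def missing_ai_settings(env=None):
    """Return the required AI-model settings that are unset or empty."""
    env = os.environ if env is None else env
    return [name for name in REQUIRED_AI_SETTINGS if not env.get(name)]
```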
First Use
The first-use flow is roughly:
- Open http://127.0.0.1:8000.
- Log in to the Web UI.
- Go to Goofish account management.
- Use the provided Chrome extension to export Goofish login-state JSON.
- Paste the login state into the system; the state file is saved to state/, for example state/acc_1.json.
- Return to task management, create a task, and bind an account.
- Run the task and check results.
The key part is login state. Goofish does not provide a standard open API for arbitrary third-party scraping, so the project uses browser login state to simulate normal page access. Expired login state, risk controls, captchas, and account anomalies can all affect task execution.
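Because the login state is effectively a Playwright storage-state JSON (top-level "cookies" and "origins" keys), a basic sanity check at load time can distinguish an expired export from a broken one. The validation rules below are assumptions for illustration, not the project's actual logic:

```python
import json

def load_login_state(path):
    """Load a Playwright storage-state JSON and do a basic sanity check.

    Playwright's storage_state files have top-level "cookies" and
    "origins" keys; the extra validation here is an assumption, not the
    project's actual check.
    """
    with open(path, encoding="utf-8") as f:
        state = json.load(f)
    cookies = state.get("cookies")
    if not isinstance(cookies, list) or not cookies:
        raise ValueError(f"{path}: no cookies in state; re-export the login state")
    return state
```

Note that a structurally valid file can still be expired server-side, so a failed page load remains the only reliable signal.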
AI Tasks and Keyword Tasks
The project supports two task creation modes.
The first is AI judgment (AI判断).
You can enter detailed requirements, and the system asynchronously generates analysis criteria. This fits complex needs, for example:
- Only the main unit, no accessories.
- Alert only when the price is clearly below market.
- Good condition, with no water damage, repair, or hidden defects in the description.
- Prefer local listings and face-to-face pickup.
- Images should show serial number, packaging, or key accessories.
The second is keyword judgment (关键词判断).
This is closer to traditional rule-based monitoring: create tasks directly from keywords, prices, regions, and other conditions without AI generation. It fits simple rules where some false positives are acceptable.
In practice, you can mix both: keywords handle initial filtering, AI reduces false positives.
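This mixing strategy can be sketched as a cheap rule-based pass that runs before any AI call, so only surviving candidates cost model tokens. The function name and listing field names below are hypothetical:

```python
def prefilter(listings, keywords, max_price, exclude_words=()):
    """Cheap rule-based pass that runs before any AI call.

    Keeps a listing only if some keyword appears in its text, its price
    is within budget, and no blacklist word appears. Field names are
    hypothetical, not the project's real schema.
    """
    hits = []
    for item in listings:
        text = (item["title"] + " " + item.get("description", "")).lower()
        if not any(kw.lower() in text for kw in keywords):
            continue          # keyword miss
        if item["price"] > max_price:
            continue          # over budget
        if any(w.lower() in text for w in exclude_words):
            continue          # accessory / damage blacklist
        hits.append(item)
    return hits
```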
What the Web UI Does
The Web UI is a major difference between this project and ordinary scripts.
The task management page can configure:
- AI-created tasks.
- Keyword rules.
- Price ranges.
- Newly listed windows.
- Region filters.
- Account binding.
- Scheduling rules.
The account management page can:
- Import Goofish account login state.
- Update login state.
- Delete accounts.
- Assign accounts to tasks.
- Let the system choose accounts automatically.
Results and logs pages can:
- View matching products.
- Export results.
- Search history.
- Inspect task execution.
- Troubleshoot login expiration, risk controls, and AI call issues.
The system settings page can:
- View system status.
- Edit Prompt.
- Adjust proxy and rotation configuration.
For long-term monitoring, the Web UI is important: once many tasks are running, configuration, logs, results, and notifications quickly become hard to maintain without it.
Data Storage
The current primary storage is SQLite, with the database file kept under the data/ directory by default.
Docker mounts the data/ directory, and with it the main SQLite database, by default.
On startup, the app creates tables automatically and tries to import historical data once from old config.json, jsonl/, and price_history/.
Note that state/, prompts/, logs/, and images/ remain filesystem directories and are not stored in SQLite. Product images are saved temporarily to per-task folders under images/ and are cleaned up by default after the task ends.
This structure fits personal or small-team deployments: SQLite is lightweight and easy to migrate, while files keep login state, images, and logs easy to inspect.
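To make the storage model concrete, here is a minimal SQLite sketch in the same spirit, using Python's standard sqlite3 module. The schema is hypothetical and much simpler than the project's real tables:

```python
import sqlite3

def init_db(path=":memory:"):
    """Create a toy results table; hypothetical schema, for illustration."""
    con = sqlite3.connect(path)
    con.execute("""CREATE TABLE IF NOT EXISTS results (
        id INTEGER PRIMARY KEY,
        task TEXT, item_id TEXT, title TEXT, price REAL, matched INTEGER,
        seen_at TEXT DEFAULT CURRENT_TIMESTAMP)""")
    return con

def record(con, task, item_id, title, price, matched):
    con.execute(
        "INSERT INTO results (task, item_id, title, price, matched) "
        "VALUES (?, ?, ?, ?, ?)",
        (task, item_id, title, price, int(matched)))
    con.commit()

def price_history(con, item_id):
    """Prices seen for one listing, oldest first."""
    rows = con.execute(
        "SELECT price FROM results WHERE item_id = ? ORDER BY id", (item_id,))
    return [price for (price,) in rows]
```

Because SQLite is a single file, backing up or migrating this kind of store is just copying data/ while the service is stopped.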
Notification Channels
The project supports multiple notification channels. Common configuration items include:
- NTFY_TOPIC_URL
- GOTIFY_URL / GOTIFY_TOKEN
- BARK_URL
- WX_BOT_URL
- TELEGRAM_BOT_TOKEN / TELEGRAM_CHAT_ID / TELEGRAM_API_BASE_URL
- WEBHOOK_*
Notifications are central to the experience. If a monitor only writes results to a backend, users still need to keep checking the page. With push notifications, matching listings reach the user immediately.
A practical setup is to tier notifications by item value:
- Normal keyword hits are written only to the backend.
- High-confidence AI results are pushed to the phone.
- High-value items are pushed to WeCom or Telegram.
- Enable more logs during debugging, then reduce noise after stabilization.
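The tiering above can be expressed as a small routing function. The thresholds, field names, and channel labels here are illustrative only, not anything the project defines:

```python
def route_notification(result, value_threshold=3000):
    """Mirror the tiering described above; all thresholds are illustrative."""
    channels = ["backend"]                           # every hit is stored
    if result.get("ai_confidence", 0.0) >= 0.8:      # high-confidence AI match
        channels.append("phone_push")
    if result.get("price", 0.0) >= value_threshold:  # high-value item
        channels.append("wecom_or_telegram")
    return channels
```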
Developer Run
Without Docker, local development requires:
- Python 3.10+
- Node.js + npm
- Playwright CLI
- Chromium or Chrome / Edge browser
The basic commands are in the README; typically, install the Python dependencies and run playwright install to download a browser.
One-command startup is handled by start.sh.
start.sh checks Playwright CLI and browser conditions, installs dependencies, builds the frontend, copies build artifacts, and starts the backend.
The backend can also be started manually with uvicorn, or through the project's Python entry point; both commands are in the README.
Frontend development follows the usual npm install and dev-server workflow; the exact scripts are in the README.
Testing and build commands are likewise listed in the README.
Who It Fits
ai-goofish-monitor fits users who:
- Frequently watch Goofish for specific models.
- Want to monitor second-hand electronics, cameras, game devices, hardware parts, and similar goods.
- Want to automate “keyword search + manual screening.”
- Have an OpenAI-compatible model API and accept AI judgment costs.
- Are comfortable with Docker or basic command-line deployment.
- Need matching results pushed to phone, WeCom, or Telegram.
It is less suitable if you:
- Know nothing about deployment and only want a ready-to-use app.
- Do not want to handle login state, captchas, or account risk controls.
- Need official authorization and strongly compliant data APIs.
- Want large-scale high-frequency platform scraping.
- Expect AI to automatically judge trading risk and place orders for you.
Risks and Boundaries
Tools like this need clear boundaries.
First, follow platform rules.
Goofish has its own terms, risk controls, and account security mechanisms. Automation may trigger restrictions. Do not scrape at high frequency, bypass risk controls, harass sellers, bulk-collect private information, or disrupt platform order.
Second, protect account login state.
Files in state/ are login-state cookie files. They are effectively account access credentials. Do not commit them to Git, and do not put them on untrusted servers. If the server is exposed to the public internet, change the default Web UI password and place it behind a VPN, reverse-proxy authentication, or an internal network.
Third, AI judgment is not a factual guarantee.
AI can reduce false positives, but it cannot guarantee product authenticity, seller trustworthiness, reasonable pricing, or transaction safety. You still need to review product details, seller reputation, chat history, shipping method, and payment process.
Fourth, watch cost.
If every candidate listing is analyzed by a multimodal model, costs can rise quickly. Use keyword, price, and region filters first, then send a smaller candidate set to AI.
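A back-of-the-envelope cost model shows why this prefiltering matters. All numbers below are hypothetical, chosen only to make the arithmetic concrete:

```python
def daily_ai_cost(candidates_per_run, runs_per_day, pass_rate, cost_per_call):
    """Rough daily spend when only prefiltered candidates reach the model.

    pass_rate is the fraction of candidates surviving the keyword, price,
    and region filters; all inputs are hypothetical examples.
    """
    ai_calls = candidates_per_run * pass_rate * runs_per_day
    return ai_calls * cost_per_call
```

For example, 200 candidates per run, 24 runs a day, a 10% prefilter pass rate, and $0.01 per call come to roughly $4.80 a day; sending every candidate to the model (pass rate 1.0) would cost ten times that.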
Fifth, protect privacy.
Product screenshots, chat-related content, account state, and notification content may all contain sensitive information. Protect notification Webhooks, log directories, and databases carefully.
Difference from Ordinary Scripts
Ordinary Goofish monitoring scripts usually do three things:
- Search keywords.
- Check prices.
- Send notifications.
ai-goofish-monitor goes further:
- Manage tasks and accounts through Web UI.
- Express complex buying criteria with AI Prompt.
- Use multimodal models to inspect product images and descriptions.
- Store results and price history in SQLite.
- Use log pages to diagnose task failures.
- Improve stability through proxy rotation and multi-account mechanisms.
- Support long-running schedules with Cron.
Because it does more, it also costs more to deploy and maintain. For normal users, Docker is the easiest path. For developers, the Web UI, FastAPI, Playwright, and SQLite structure is also friendly for secondary development.
How to Use It
A practical approach is to start with small tasks.
For example, if you want a second-hand camera, create a task like this:
- Keywords: A7C, 索尼 A7C
- Price range: set an upper limit based on market price
- Region: prefer same province or local city
- Newly listed window: last day or last few hours
- AI criteria: exclude lens-only listings, repaired units, obvious accessories; pay attention to shutter count and condition
- Notification: push only results that pass AI judgment
After it runs stably, gradually add more tasks. Do not start with dozens of keywords, multiple accounts, and high-frequency Cron. First check login stability, false-positive rate, AI cost, and notification noise, then tune parameters.
Summary
ai-goofish-monitor moves Goofish monitoring from a “keyword script” to a manageable AI monitoring system. It uses Playwright for page automation, AI for complex judgment, Web UI for tasks and results, SQLite for storage, and multiple notification channels for delivery.
It is best for individuals or small teams monitoring specific products, especially second-hand electronics, hardware, and cameras where prices fluctuate, timing matters, and descriptions are noisy.
But it must be used carefully: protect login state, change default passwords, keep scraping frequency restrained, review AI results manually, and respect platform rules and privacy boundaries. As an assisted filtering tool, it can be valuable. As a fully automated trading system, it is easy to overestimate.