Open Issues Need Help
AI Summary: Orchestrate a complete end-to-end demo showcasing data ingestion, database storage, backtesting, and result export. Following the demo, gather feedback, document key learnings, and create a plan for improvements in Phase 2.
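A very rough outline of what such a demo script could look like; all four step functions below are placeholders standing in for the real ingestion, storage, backtesting, and export modules, not actual project code.

```python
# Placeholder end-to-end demo: ingest -> store -> backtest -> export.
import json
from pathlib import Path


def ingest(symbol: str) -> list[dict]:
    # Placeholder: would call the Binance OHLCV fetcher.
    return [{"symbol": symbol, "date": "2024-01-02", "close": 100.0}]


def store(bars: list[dict]) -> None:
    # Placeholder: would upsert bars into the database.
    pass


def backtest(bars: list[dict]) -> dict:
    # Placeholder: would invoke the backtesting engine.
    return {"symbol": bars[0]["symbol"], "total_return": 0.0}


def export(result: dict, path: Path) -> None:
    # Write the backtest result to a JSON file.
    path.write_text(json.dumps(result, indent=2))


if __name__ == "__main__":
    bars = ingest("BTCUSDT")
    store(bars)
    export(backtest(bars), Path("demo_result.json"))
```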
AI Summary: This task requires writing at least 30 Pytest unit and integration tests for the project's data ingestion, backtesting, and CLI modules, integrating them into the GitHub Actions CI pipeline, and setting minimum code coverage thresholds.
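For reference, a minimal sketch of the kind of unit test this issue calls for; the `compute_returns` helper is defined inline as a stand-in, since the project's real module layout isn't specified here. The coverage threshold would be enforced with pytest-cov in the CI workflow (e.g. `pytest --cov --cov-fail-under=80`).

```python
# Sketch of a Pytest unit test in the style this issue asks for. The helper
# under test is a stand-in defined inline; real tests would import the
# project's own ingestion/backtesting/CLI modules instead.
import numpy as np


def compute_returns(prices: np.ndarray) -> np.ndarray:
    """Simple percentage returns; placeholder for a project helper."""
    return np.diff(prices) / prices[:-1]


def test_compute_returns_basic():
    prices = np.array([100.0, 110.0, 99.0])
    np.testing.assert_allclose(compute_returns(prices), [0.10, -0.10])


def test_compute_returns_single_price_gives_empty_result():
    assert compute_returns(np.array([100.0])).size == 0
```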
AI Summary: Develop a command-line interface (CLI) tool named 'tb' with a 'backtest' command. This command should accept a stock symbol and a date range as input, perform backtesting simulations, and output the results in both CSV and JSON formats.
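One plausible shape for the `tb backtest` command, sketched with argparse; the flag names and the placeholder result dictionary are assumptions, not the project's actual interface.

```python
# Hedged sketch of the `tb backtest` subcommand with CSV/JSON output.
import argparse
import csv
import json
import sys


def build_parser() -> argparse.ArgumentParser:
    parser = argparse.ArgumentParser(prog="tb")
    sub = parser.add_subparsers(dest="command", required=True)

    bt = sub.add_parser("backtest", help="Run a backtest for one symbol")
    bt.add_argument("symbol", help="Ticker symbol, e.g. BTCUSDT")
    bt.add_argument("--start", required=True, help="Start date, YYYY-MM-DD")
    bt.add_argument("--end", required=True, help="End date, YYYY-MM-DD")
    bt.add_argument("--format", choices=["csv", "json"], default="json")
    return parser


def main(argv=None) -> int:
    args = build_parser().parse_args(argv)
    # Placeholder result; the real command would call the backtesting engine.
    result = {"symbol": args.symbol, "start": args.start, "end": args.end,
              "total_return": None}
    if args.format == "json":
        json.dump(result, sys.stdout, indent=2)
    else:
        writer = csv.DictWriter(sys.stdout, fieldnames=result.keys())
        writer.writeheader()
        writer.writerow(result)
    return 0


if __name__ == "__main__":
    raise SystemExit(main())
```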
AI Summary: Develop a microservice that consumes real-time trade and order book data from the Binance WebSocket API, normalizes it, aggregates it into bar data, and publishes the results to Kafka topics for other services to consume.
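A rough sketch of the trade-consuming half of such a service, assuming the `websockets` and `kafka-python` libraries; the Kafka topic name, bootstrap address, and internal trade schema are placeholders. Bar aggregation would sit between `normalize` and the Kafka publish step.

```python
# Consume Binance trade messages over WebSocket and republish them to Kafka.
import asyncio
import json

import websockets
from kafka import KafkaProducer

BINANCE_WS = "wss://stream.binance.com:9443/ws/btcusdt@trade"
TRADES_TOPIC = "trades.raw"  # assumed topic name

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)


def normalize(msg: dict) -> dict:
    """Map Binance trade fields to a flat internal schema."""
    return {
        "symbol": msg["s"],
        "price": float(msg["p"]),
        "qty": float(msg["q"]),
        "ts_ms": msg["T"],
        "buyer_is_maker": msg["m"],
    }


async def run() -> None:
    async with websockets.connect(BINANCE_WS) as ws:
        async for raw in ws:
            producer.send(TRADES_TOPIC, normalize(json.loads(raw)))


if __name__ == "__main__":
    asyncio.run(run())
```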
AI Summary: Develop a vectorized backtesting engine using NumPy and pandas to evaluate trading strategies on daily market data. The engine should account for slippage and commissions, and produce key performance metrics.
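A compact illustration of the vectorized approach, assuming `prices` is a daily close series and `signal` a 0/1 desired-position series; the engine shifts the signal one bar to avoid look-ahead and charges commission plus slippage on every change in position. Metric names and default cost values are illustrative.

```python
# Minimal vectorized backtest with per-trade commission and slippage.
import numpy as np
import pandas as pd


def backtest(prices: pd.Series, signal: pd.Series,
             commission: float = 0.001, slippage: float = 0.0005) -> dict:
    returns = prices.pct_change().fillna(0.0)
    position = signal.shift(1).fillna(0.0)      # enter on the next bar
    turnover = position.diff().abs().fillna(0.0)
    costs = turnover * (commission + slippage)
    strat = position * returns - costs
    equity = (1.0 + strat).cumprod()
    sharpe = (np.sqrt(252) * strat.mean() / strat.std()
              if strat.std() > 0 else np.nan)
    return {
        "total_return": equity.iloc[-1] - 1.0,
        "sharpe": sharpe,
        "max_drawdown": (equity / equity.cummax() - 1.0).min(),
    }
```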
AI Summary: Create a Binance OHLCV data fetcher with a daily cron job for backfilling missing data. The solution should include robust error handling and data validation to ensure data completeness and accuracy.
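A sketch of the daily backfill step against Binance's public `/api/v3/klines` endpoint; the validation rules and the symbol are illustrative, and the scheduling (e.g. a cron entry such as `0 1 * * *` on a UTC host) is assumed to live outside the script.

```python
# Fetch one daily OHLCV bar from Binance and validate it before storage.
import datetime as dt

import requests

KLINES_URL = "https://api.binance.com/api/v3/klines"


def fetch_daily_ohlcv(symbol: str, day: dt.date) -> dict:
    start = int(dt.datetime(day.year, day.month, day.day,
                            tzinfo=dt.timezone.utc).timestamp() * 1000)
    resp = requests.get(
        KLINES_URL,
        params={"symbol": symbol, "interval": "1d",
                "startTime": start, "limit": 1},
        timeout=10,
    )
    resp.raise_for_status()
    rows = resp.json()
    if not rows:
        raise ValueError(f"No kline returned for {symbol} on {day}")
    _open_time, open_px, high, low, close, volume = rows[0][:6]
    bar = {"symbol": symbol, "date": day.isoformat(),
           "open": float(open_px), "high": float(high),
           "low": float(low), "close": float(close),
           "volume": float(volume)}
    if not bar["low"] <= bar["close"] <= bar["high"]:
        raise ValueError(f"Inconsistent OHLC values for {symbol}: {bar}")
    return bar


if __name__ == "__main__":
    yesterday = dt.date.today() - dt.timedelta(days=1)
    print(fetch_daily_ohlcv("BTCUSDT", yesterday))  # replace with a DB upsert
```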
AI Summary: This task requires setting up a Dockerized TimescaleDB database with PGAdmin for administration, configuring database migrations using Alembic, and writing unit tests to ensure the database schema, migration process, and connectivity are all working correctly.
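A sketch of two of the checks such a test suite could make, assuming SQLAlchemy and a TimescaleDB container reachable at the DSN below; the `ohlcv_daily` table name is an assumption about what the Alembic migrations would create.

```python
# Connectivity and schema checks against the Dockerized TimescaleDB instance.
import os

import pytest
from sqlalchemy import create_engine, inspect, text

DSN = os.environ.get(
    "DATABASE_URL",
    "postgresql+psycopg2://postgres:postgres@localhost:5432/postgres",
)


@pytest.fixture(scope="module")
def engine():
    return create_engine(DSN)


def test_timescaledb_extension_installed(engine):
    with engine.connect() as conn:
        row = conn.execute(
            text("SELECT extname FROM pg_extension "
                 "WHERE extname = 'timescaledb'")
        ).fetchone()
    assert row is not None, "timescaledb extension is missing"


def test_migrated_schema_has_expected_table(engine):
    assert "ohlcv_daily" in inspect(engine).get_table_names()
```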