Gunbot's AutoConfig feature is a versatile tool for automating trading strategies, and its ability to utilize historical market data is key to many advanced filtering techniques. The amount of historical data AutoConfig retains is controlled by the `history` parameter within each job's configuration. Properly setting this parameter allows you to balance in-depth market analysis with system resource management.
Nearly every option that follows can be set without editing files by hand. Click the ⋮ (three-dots) menu → AutoConfig, step through the wizard, and press Save; it will write a correct `autoconfig.json` for you.
## The Role of Historical Data in AutoConfig
While the `snapshots` parameter in an AutoConfig job manages a rolling window of very recent ticker data, the `history` parameter deals with a longer-term, but typically less granular, set of data points. Think of it as two tiers of data retention:
- **Snapshots:** High-frequency, recent data. If a job runs every minute and `snapshots` is 60, you have detailed data for the last hour.
- **History:** Lower-frequency, older data. If `history` is 100 and `historyInterval` is 15 (minutes), AutoConfig stores 100 data points, each representing the market state at 15-minute intervals, potentially covering a much longer period (100 * 15 minutes = 25 hours).
This historical data is particularly useful for filters that need to analyze market conditions over extended periods, such as identifying overarching trends, support and resistance levels over many hours or days, or comparing current volatility to a longer-term baseline.
## Configuring the `history` Parameter
The `history` parameter is a numerical value set within an AutoConfig job's configuration. It dictates the maximum number of historical data points the job will store.
Consider this example snippet from an AutoConfig JSON file:
```json
{
  "anotherJob": {
    "enabled": true,
    "schedule": "*/1 * * * *", // Job runs every minute
    "type": "addPairs",
    "snapshots": 60,
    "history": 100, // User setting: number of historical data points to store
    "historyInterval": 15, // User setting: interval in minutes for history points
    "pairs": {
      "exchange": "kraken",
      "include": "USDT-BTC,USDT-ETH"
      // ... other pair settings
    },
    "filters": {
      // Example filter that might use historical data
      "priceChangeHistoryFilter": {
        "filterType": "minPricePctChangeIntervalHistory",
        "historySource": 0, // Use the 0th (oldest) history entry as baseline
        "minChange": 5 // Triggers if current price is 5% above oldest history price
      }
    }
    // ... other job settings
  }
}
```
In this configuration for "anotherJob":
- `history: 100` tells AutoConfig to keep up to 100 historical data points.
- `historyInterval: 15` (which translates to 15 minutes, because the base unit for `historyInterval` is minutes, often multiplied by 60 in internal logic to represent seconds) means that a new historical data point is considered for storage roughly every 15 minutes.

The actual mechanism involves taking the oldest snapshot (from the `snapshots` pool) and moving it to the history pool when the time criteria, based on `historyInterval`, are met.
When a new historical data point is added, if the count exceeds the value of `history`, the oldest historical point is removed. This ensures the historical data buffer doesn't grow indefinitely.
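To make that mechanism concrete, here is a minimal conceptual sketch of the two-tier retention logic. It is not Gunbot's actual implementation; the function and field names are illustrative and only mirror the behaviour described above.

```typescript
// Conceptual sketch of two-tier data retention; names are illustrative,
// not Gunbot internals.
interface TickerPoint {
  timestamp: number; // milliseconds since epoch
  price: number;
}

function retainTickerData(
  snapshots: TickerPoint[],       // rolling window of recent data, oldest first
  history: TickerPoint[],         // sparser long-term pool, oldest first
  newPoint: TickerPoint,
  maxSnapshots: number,           // the job's `snapshots` setting
  maxHistory: number,             // the job's `history` setting
  historyIntervalMinutes: number  // the job's `historyInterval` setting
): void {
  snapshots.push(newPoint);

  // Keep only the most recent `maxSnapshots` points in the snapshot pool.
  while (snapshots.length > maxSnapshots) {
    const oldestSnapshot = snapshots.shift()!;

    // Promote the evicted snapshot to history only if roughly
    // `historyInterval` minutes have passed since the last history point.
    const lastHistoryPoint = history[history.length - 1];
    const intervalMs = historyIntervalMinutes * 60 * 1000;
    if (!lastHistoryPoint || oldestSnapshot.timestamp - lastHistoryPoint.timestamp >= intervalMs) {
      history.push(oldestSnapshot);
    }

    // Trim the history pool so it never exceeds the `history` count.
    while (history.length > maxHistory) {
      history.shift(); // drop the oldest historical point
    }
  }
}
```

With `snapshots: 60` on a one-minute schedule and `historyInterval: 15`, roughly every fifteenth evicted snapshot would end up in the history pool in this sketch, and the pool is then capped at the configured `history` length.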
## Impact of the `history` Setting
The value assigned to the `history` parameter has several important implications:
- **Depth of Historical Analysis:** A higher `history` value, combined with an appropriate `historyInterval`, allows filters to analyze market behavior over a more extended past period. This is crucial for strategies that rely on identifying long-term trends or cyclical patterns.
- **Filter Effectiveness:** Certain filters, like `minPricePctChangeIntervalHistory` or `maxSlopePctIntervalHistory`, explicitly use this historical data. The `historySource` sub-parameter within these filters often refers to an index within the stored historical data array. A larger `history` pool provides more data points for these filters to reference.
- **Resource Consumption:** Each historical data point consumes memory and, if persisted, disk space. While historical data points are typically less frequent than snapshots, a very large `history` value can still contribute to resource load, especially if many jobs are running with high `history` counts.
- **Data Granularity:** The `historyInterval` setting determines the time gap between consecutive historical data points. A smaller `historyInterval` with a large `history` count means more granular data over a long period, but also faster turnover of historical points and potentially more processing.
## Choosing an Appropriate `history` Value
The optimal `history` value is strategy-dependent:
- **For strategies needing several hours of context:** A `history` value of 48 with a `historyInterval` of 5 (minutes) would provide 48 data points covering 240 minutes (4 hours).
- **For strategies analyzing daily or multi-day patterns:** You might use a `history` value of 96 with a `historyInterval` of 15 (minutes), covering 24 hours of data. Or `history: 168` with `historyInterval: 60` for 7 days of hourly data points (the sketch after this list works through these coverage numbers).
- **Resource Constraints:** Always be mindful of your system's RAM and disk space. If using a Raspberry Pi or a small VPS, you might need to be more conservative with `history` values compared to a powerful desktop machine.
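If it helps to double-check these combinations, the small helper below (purely illustrative, not part of Gunbot) computes the total coverage as `history * historyInterval`:

```typescript
// Illustrative helper: convert a history / historyInterval pair into
// the total time span it covers.
function historyCoverage(history: number, historyIntervalMinutes: number): string {
  const totalMinutes = history * historyIntervalMinutes;
  const hours = totalMinutes / 60;
  return hours >= 48 ? `${hours / 24} days` : `${hours} hours`;
}

console.log(historyCoverage(48, 5));   // "4 hours"  -> intraday context
console.log(historyCoverage(96, 15));  // "24 hours" -> one full day
console.log(historyCoverage(168, 60)); // "7 days"   -> a week of hourly points
```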
**Key Considerations:**

- **`historyInterval` Synergy:** The `history` parameter is most effective when considered alongside `historyInterval`. The total duration covered by historical data is `history * historyInterval`.
- **Filter Requirements:** If your filters use `historySource` to pick a specific historical point (e.g., `historySource: 0` for the oldest, `historySource: 99` for the most recent if `history` is 100), ensure your `history` value is large enough to accommodate the indices your filters reference (see the sketch after this list).
- **Data Relevance:** Storing data from too far in the past might not be relevant for some fast-moving markets or strategies. Tailor the retention period to what is actionable for your trading logic.
By carefully setting the `history` parameter, you provide AutoConfig with the necessary depth of market context, enabling more sophisticated and potentially more effective automated trading decisions. Remember to balance the need for data with the practical limits of your trading system's resources.