In spark-ignited (SI) engines, the spark advance (SA) controls combustion phasing, which has a significant impact on efficiency. Online self-optimizing control (SOC) of SA to maximize the indicated fuel conversion efficiency (IFCE) forms a stochastic optimization problem over a static map, owing to the stochasticity of combustion. Gradient-based optimization algorithms using periodic dithers are effective for such problems; however, decision sequences corrupted by periodic dithers are undesirable in online SOC of SA. To obtain a proper decision sequence for this problem, a gradient descent-based, dither-free SOC scheme is proposed that iteratively updates the decision based on probabilistic guaranteed gradient learning (PGGL). The PGGL approach approximates the gradient from the statistical distribution of past samples, and the sample size can be adaptively adjusted to meet a probabilistic target. The proposed scheme thus not only guarantees the accuracy of gradient learning but also adapts the sample size during learning, achieving a tradeoff between rapid response and a stable decision sequence. Moreover, the convergence of the resulting decision sequence is analyzed with respect to the probability distribution. Finally, experimental validation on an SI engine test bench shows that the proposed PGGL-based SOC scheme successfully keeps engine operation near the optimal IFCE with fast response and stable SA behavior, under both steady and mild transient conditions.

Source
http://dx.doi.org/10.1109/TNNLS.2017.2767293

