Difference between revisions of "Mirhoseini2011learning"

From ACES

|abstract=<p>The operability of a portable embedded system is severely constrained by its supply’s duration. We propose a novel energy management strategy for a combined (hybrid) supply consisting of a battery and a set of supercapacitors to extend the system’s lifetime. Batteries are not sufficient for handling the high load fluctuations and demand in modern complex systems. Supercapacitors hold promise for complementing battery supplies because they possess higher power density, a larger number of charge/recharge cycles, and less sensitivity to operational conditions. However, supercapacitors are not efficient as a standalone supply because of their comparatively higher leakage and lower energy density. Due to the nonlinearity of the hybrid supply elements, the multiplicity of possible supply states, and the stochastic nature of the workloads, deriving an optimal management policy is a challenge. We pose this problem as a stochastic Markov Decision Process (MDP) and develop a reinforcement learning method, called Q-learning, to derive an efficient approximation of the optimal management strategy. This method studies a diverse set of workload profiles for a mobile platform and learns the best policy in the form of an adaptive approximation approach. Evaluations on measurements collected from mobile phone users show the effectiveness of our proposed method in maximizing the combined embedded system’s lifetime.</p>
|pages=in press
|month=8
|year=2011
|booktitle=International Symposium on Low Power Electronics and Design (ISLPED)
|title=Learning to Manage Combined Energy Supply Systems
|entry=inproceedings
|date=2011-08-01
}}

Revision as of 03:38, 4 September 2021

Mirhoseini2011learning
entry = inproceedings
author = Azalia Mirhoseini and Farinaz Koushanfar
title = Learning to Manage Combined Energy Supply Systems
booktitle = International Symposium on Low Power Electronics and Design (ISLPED)
month = 8
year = 2011
pages = in press
url = http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=5993641


Email: farinaz@ucsd.edu
Address: Electrical & Computer Engineering, University of California, San Diego, 9500 Gilman Drive, MC 0407, Jacobs Hall, Room 6401, La Jolla, CA 92093-0407
Lab Location: EBU1-2514, University of California San Diego, 9500 Gilman Dr, La Jolla, CA 92093