Paper · The following article is Open access

Delay-sensitive Task Scheduling with Deep Reinforcement Learning in Mobile-edge Computing Systems

Hao Meng et al

Published under licence by IOP Publishing Ltd
Citation: Hao Meng et al 2019 J. Phys.: Conf. Ser. 1229 012059. DOI: 10.1088/1742-6596/1229/1/012059


Abstract

Mobile-edge computing (MEC) is a network architecture concept that provides cloud-computing capabilities and an IT service environment for applications and services at the edge of the network, offering low latency, high bandwidth, and real-time access to wireless network information. In this paper, we consider the task scheduling and offloading problem on mobile devices, where the computation data of the tasks to be offloaded to the MEC server are already determined. To minimize the average slowdown and the average timeout period of tasks in the buffer queue, we propose a deep reinforcement learning (DRL) based algorithm that transforms the optimization problem into a learning problem. We also design a new reward function that guides the algorithm to learn the offloading policy directly from the environment. Simulation results show that, after a period of training, the proposed algorithm outperforms traditional heuristic algorithms.
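To make the idea of learning an offloading policy concrete, below is a minimal, illustrative sketch of delay-sensitive scheduling with policy-gradient reinforcement learning. It is not the authors' implementation: the toy buffer-queue environment, the linear softmax policy, and the reward combining slowdown with a timeout penalty are all assumptions chosen only to mirror the objectives described in the abstract (minimizing average slowdown and timeout).

import numpy as np

rng = np.random.default_rng(0)

N_TASKS = 5          # tasks visible in the buffer queue per episode (assumed)
N_FEATURES = 3       # per-task features: compute size, waiting time, deadline
TIMEOUT_PENALTY = 5.0

def make_queue():
    """Random toy buffer queue: each row is (compute size, wait time, deadline)."""
    size = rng.uniform(1.0, 4.0, N_TASKS)
    wait = rng.uniform(0.0, 2.0, N_TASKS)
    deadline = rng.uniform(3.0, 8.0, N_TASKS)
    return np.stack([size, wait, deadline], axis=1)

def softmax(x):
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def reward_for(queue, action, t_now):
    """Hypothetical reward shaped like the paper's stated goal: penalize the
    task's slowdown and add an extra penalty if it finishes past its deadline."""
    size, wait, deadline = queue[action]
    finish = t_now + size              # toy model: service time equals size
    slowdown = (wait + size) / size
    timeout = max(0.0, finish - deadline)
    return -slowdown - TIMEOUT_PENALTY * (timeout > 0), finish

# Linear softmax policy over per-task feature scores (a stand-in for the DRL network).
theta = np.zeros(N_FEATURES)
alpha = 0.01

for episode in range(2000):
    queue, t_now, grads, rewards = make_queue(), 0.0, [], []
    for step in range(N_TASKS):
        probs = softmax(queue @ theta)
        a = rng.choice(len(queue), p=probs)
        r, t_now = reward_for(queue, a, t_now)
        # gradient of log pi(a | queue) w.r.t. theta for a linear softmax policy
        grads.append(queue[a] - probs @ queue)
        rewards.append(r)
        queue = np.delete(queue, a, axis=0)
    returns = np.cumsum(rewards[::-1])[::-1]   # undiscounted return-to-go
    for g, G in zip(grads, returns):
        theta += alpha * G * g                 # REINFORCE update

A full reproduction would replace the linear policy with the paper's deep network and this toy queue with an MEC offloading simulator; the sketch only shows how a slowdown-plus-timeout reward can drive a learned scheduling policy.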


Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
