

Theme Article: Envisioning Our Future Digital World

Toward the 6G Network Era: Opportunities and Challenges

Ioannis Tomkos, University of Patras
Dimitrios Klonidis, Ubitech
Evangelos Pikasis, EULAMBIA Advanced Technologies Ltd
Sergios Theodoridis, National and Kapodistrian University of Athens

Abstract—The next generation of telecommunication networks will integrate the latest developments and emerging advancements in telecommunications connectivity infrastructures. In this article, we discuss the transformation and convergence of the fifth-generation (5G) mobile network and internet of things technologies toward the emergence of smart sixth-generation (6G) networks, which will employ AI to optimize and automate their operation.

Digital Object Identifier 10.1109/MITP.2019.2963491. Date of current version 12 February 2020. 1520-9202 © 2020 IEEE. Published by the IEEE Computer Society in IT Professional.

IN RECENT YEARS, we have experienced a tremendous evolution in information and communications technology, with major breakthroughs in mobile networks. Early technology developments related to 5G mobile networks targeted mainly three generic and distinct use cases: "enhanced mobile broadband" (eMBB), "massive machine-type communications," and "ultra-reliable low-latency communications."[1] The associated applications have stringent specifications in terms of critical system parameters such as the number of connected devices, power consumption, bit rate, latency, and availability, among others. Each family of use cases focuses primarily on optimizing one key design parameter (e.g., bit rate, number of connected devices, or latency).

Many applications can be enabled by early 5G networks, such as real-time closed-loop robotic control, video-driven machine-human interaction, and augmented reality/virtual reality (AR/VR) applications with high-definition 360° video streaming, which require high bit rates (in the range of tens of gigabits per second) and low latency (<10 ms). It is foreseen that evolving 5G networks will enable even more demanding


applications, such as sharing and updating high-resolution maps in real time for the control of autonomous vehicles. This is an example of the most demanding use case among the 5G use-case family, called ultra-high-speed low-latency communications.[1]

To support the requirements of current and emerging use cases, we need to move beyond current 5G networks, which operate at low (sub-6 GHz) radio frequency bands, toward millimeter (mm)-wave ones that operate at higher carrier frequencies, initially in the high radio frequency window, i.e., above 20 GHz, and then even higher toward optical frequencies in the visible and invisible electromagnetic spectrum.[2] Only then will the true capabilities of 5G networks be gradually revealed. In the subsequent evolution stage, the technologies of both 5G networks and the next-generation internet of things (NG-IoT) will move toward some form of convergence, where the use of mm-wave frequencies will become commonplace.[3] The NG-IoT will combine technologies such as AI, distributed edge computing, and end-to-end distributed security, supporting the high bit rates serving eMBB use cases. Besides the evolution of wireless connectivity at the edge of the network, we anticipate major advances also in the control and management of the edge network, supported by the introduction of new technology approaches driven by artificial intelligence (AI) solutions, going well beyond what can be achieved today with software-defined networking (SDN) and network functions virtualization (NFV).

The ultimate goal will be to enhance all the applications and vertical use cases of 5G networks with the capability of cognition offered by cognitive computing systems that use the vast amounts of available data and machine learning (ML) algorithms.[1],[2]

FROM MOBILE-EDGE-COMPUTING TO AI-AT-THE-EDGE

The advent of the edge computing concept, in combination with the vastly increased number of data processing devices at the edge and the emerging 5G networks, leads to a distributed mobile edge computing (MEC) environment that will enable the efficient implementation of 5G use cases in vertical markets. MEC has gained significant momentum recently, as it transforms 5G mobile communication networks into distributed cloud computing platforms. Located in close proximity to the edge devices, MEC enables low-latency service delivery, in the order of microseconds, over the radio access network (RAN), to/from mobile users and the IoT.

The anticipated future requirements for introducing intelligence at the edge devices will lead to the evolution of MEC toward an AI-enabled platform capable of offering intelligent services delivered over the fixed or mobile access network to the edge devices. Such devices can include, in the future, computationally efficient dedicated hardware capable of running ML/AI algorithms locally at the edge, deviating from the classical concept of ML/AI; the latter focuses mainly on offline and centralized AI/ML and is implemented by a cloud computing model, in which the entire dataset is given a priori and is used for the training stage. To obtain accurate and reliable inference, a central controller divides the training dataset into mini-batches and allocates them to multiple processing devices. The central controller iteratively collects and aggregates their local training results, until a cost function for the training converges.
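The controller/worker loop just described can be sketched in a few lines. The least-squares model, learning rate, and batch layout below are our own illustrative assumptions, not the article's specification:

```python
import random

# Centralized training as described above: a controller splits the dataset
# into mini-batches, workers compute local gradients on their own batch, and
# the controller aggregates the local results until the cost converges.
random.seed(0)
X = [random.uniform(-1.0, 1.0) for _ in range(400)]
y = [2.0 * x + 1.0 for x in X]          # toy ground truth: y = 2x + 1

def local_gradient(w, b, xs, ys):
    """Gradient of half the mean squared error on one worker's mini-batch."""
    gw = sum((x * w + b - t) * x for x, t in zip(xs, ys)) / len(xs)
    gb = sum(x * w + b - t for x, t in zip(xs, ys)) / len(xs)
    return gw, gb

w, b, lr = 0.0, 0.0, 0.5
batches = [(X[i::4], y[i::4]) for i in range(4)]      # controller allocates 4 mini-batches
for step in range(300):
    grads = [local_gradient(w, b, xs, ys) for xs, ys in batches]  # workers compute locally
    gw = sum(g[0] for g in grads) / len(grads)        # controller aggregates the results
    gb = sum(g[1] for g in grads) / len(grads)
    w, b = w - lr * gw, b - lr * gb
    cost = sum((x * w + b - t) ** 2 for x, t in zip(X, y)) / len(X)
    if cost < 1e-8:                                   # stop once the cost has converged
        break

print(round(w, 2), round(b, 2))  # both end up close to the true values 2 and 1
```

Because the entire dataset is fixed up front, this is exactly the "one-time" training the next paragraph criticizes: nothing in the loop adapts if the data distribution later changes.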
However, such a one-time training process is vulnerable to initially unmodeled phenomena. ML/AI algorithms implemented over such a cloud-computing model come with a number of drawbacks, e.g., their need for large training datasets, their difficulty in coping satisfactorily with dynamically changing environments, their poor performance in generalizing their "knowledge" to new datasets without forgetting the previous ones, and, most importantly, their huge computational costs, which make them unsuitable for edge computing.[4]

DISTRIBUTED AND FEDERATED AI

In contrast to cloud-based architectures, recent research investigations have sparked huge interest in collaborative, distributed, low-latency, and reliable ML, calling for a major departure from cloud-based and centralized training and inference, toward a novel system


design coined the term "AI at the Edge," in which we have the following: i) training data are unevenly distributed over a large number of edge devices, such as network base stations (BSs) and/or mobile devices, including phones, cameras, vehicles, drones, etc.; ii) every edge device has access to a fraction of the data and limited computation and storage power, where edge devices exchange their locally trained models, instead of exchanging their private data, and wherein training and inference are carried out collectively; and finally, iii) data abstraction, cleansing, and dimensionality reduction of network data become vital, as the massive amount of monitored data cannot be stored.

It becomes evident that AI at the edge is a nascent research field, whose system design is entangled with secure and reliable communications and with on-device resource constraints (i.e., energy, memory, and computing power). This architecture allows the deployment of an evolved 5G/6G network that can operate as a distributed computer with the goal of offering AI at the edge, and that will be deployed between the cloud processing level (the network core) and the connected end users/devices (at the wireless access). It will be a platform composed of diverse types of nodes and infrastructures with different processing capabilities that are either edge nodes (e.g., industrial or enterprise nodes, smart city servers, smart home servers, customer premises equipment (CPEs), and private infrastructure gateways) or network services nodes (e.g., mini-data centers (DCs), application servers, content delivery, and data storage nodes).

The concept relies on the dynamic formation of collaborative edge computing domains (ECDs) at or close to the end user that are smartly interconnected and managed over the metro-access wireless-optical infrastructure.
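The bandwidth argument behind point ii) can be made concrete with a rough comparison of the two exchange strategies. The device and model sizes below are our own illustrative assumptions, not figures from the article:

```python
# Rough upload-cost comparison for one edge device: shipping its private
# dataset to the cloud versus shipping only its locally trained model.

def raw_data_bytes(num_samples, bytes_per_sample):
    """Upload cost if the device ships its private dataset."""
    return num_samples * bytes_per_sample

def model_bytes(num_params, bytes_per_param=4):
    """Upload cost if the device instead ships float32 model parameters."""
    return num_params * bytes_per_param

# Hypothetical camera node: 10,000 images of 100 kB each,
# versus a small model with one million parameters.
data_cost = raw_data_bytes(10_000, 100_000)   # 1 GB of private data
model_cost = model_bytes(1_000_000)           # 4 MB of parameters

print(data_cost // model_cost)  # -> 250: model exchange is 250x cheaper here
```

The exact ratio depends entirely on the assumed sizes, but it illustrates why exchanging locally trained models, rather than the monitored data itself, is what makes point iii)'s storage constraint tractable.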
The ECDs can be composed of one or several edge nodes, with a diverse set of processing capabilities (i.e., from simple Raspberry Pis to nano-/mini-DCs), that can provide the required computational resources and data pools to handle a set of running services. As the service requirements vary, additional resources, data, or applications may be needed from other ECDs (i.e., service expansion). The overall architecture is depicted in Figure 1.

Figure 1. Diverse set of service provisioning fields addressed by the edge computing concept in a generic 6G network model at the access, with interconnection to the legacy centralized edge processing node and cloud.

Enabling ML at the network edge introduces novel fundamental research problems in terms of jointly optimizing training, communication, and control under end-to-end (E2E) latency, reliability, security, privacy, and trustworthiness specifications, as well as edge devices' hardware


requirements. For practical edge ML, online decentralized training can simplify the implementation and preserve privacy by exchanging not the entire dataset but only the model-related parameters, with (or without) a simple parameter aggregator, thereby reflecting a huge volume of user-generated data samples in real time.

A major effort on the "AI at the edge" front is to develop efficient schemes for distributed optimization among a large number of devices that share a common model to be trained, via data that are distributed across a large number of interconnected devices and by utilizing certain models.[5] One such interesting ML training architecture following a decentralized processing rationale is the federated learning approach, in which mobile devices periodically exchange their neural network (NN) weights and gradients during local training.

UTILIZING COMMUNICATIONS FOR ML (CML) AND ML FOR COMMUNICATIONS (MLC) IN THE FUTURE 6G SMART NETWORKS

From the aforementioned discussions, it becomes clear that high-speed, low-latency, and reliable communications are essential for supporting ML/AI at the edge, giving rise to the research field entitled CML.[1] Going a step further from what has been discussed in the previous section, we note that the use of the next generation of wireless connectivity at the edge of the network can enable far more advanced concepts than the described AI at the edge.
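The federated learning approach described above can be sketched in a few lines. The single-weight model, learning rate, round count, and per-device data distributions are illustrative assumptions of ours, not the article's specification:

```python
import random

# Federated-averaging sketch: each device trains a shared model on its own
# private data for a few local SGD steps; only the model weight is then
# exchanged and averaged, so the raw data never leave the devices.
random.seed(1)
TRUE_W = 3.0  # toy ground truth for the model y = w * x

def make_device_data(n):
    """Each device holds a private dataset drawn from y = TRUE_W * x."""
    data = []
    for _ in range(n):
        x = random.uniform(-1.0, 1.0)
        data.append((x, TRUE_W * x))
    return data

devices = [make_device_data(50) for _ in range(5)]  # 5 devices, private data

def local_training(w, data, lr=0.1, steps=10):
    """One device refines the shared weight on its local data only."""
    for _ in range(steps):
        x, t = random.choice(data)
        w -= lr * (w * x - t) * x  # SGD step on a local sample
    return w

w_global = 0.0
for _round in range(30):  # periodic weight exchange, as in federated learning
    local_ws = [local_training(w_global, d) for d in devices]
    w_global = sum(local_ws) / len(local_ws)  # aggregator averages the models
```

After the exchange rounds, `w_global` sits close to the true value 3.0 even though no party ever saw the pooled dataset, which is the privacy-preserving property the text emphasizes.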
Future generations of wireless communication systems (allocating in excess of 10 GHz to RF channels in the THz and optical frequency regimes, while exploiting modulation methods capable of 10 b/symbol) can achieve data rates of about 100 Tb/s.[2] With such assumptions in mind, one can envision as plausible in the future the realization of the concept of wireless cognition, that is, a wireless communications link that enables massive computations (matching the capabilities of the human brain) to be conducted remotely from the device or machine that is undertaking real-time intelligent actions at the edge of the network. If that vision indeed becomes a reality, then, in the future, robots, autonomous vehicles, IoT devices, and other machines may be designed to exploit cognitive processing performed remotely from them, at a platform/server collocated with a 6G BS and in wireless connection with those devices.[2]

Furthermore, exploiting edge ML to improve the performance of communication networks is another research area, one that epitomizes the research direction of MLC,[1] wherein ML is used to optimize the operation of communication systems and networks.[5] An intelligent AI-enabled future 6G mobile network infrastructure will make the best use of the physical and computing resources at the edge of the network to offer intelligent systems supporting an entirely new generation of services. The ML/AI algorithms may also assist the software-defined control and management plane of the communication network in order to, e.g., i) predict the required resources, based on a number of intrinsic networking parameters or external (e.g., environmental, social, etc.) factors; ii) identify potential physical failures, device malfunctions, and even malicious attacks and usage; and iii) dynamically vary the smart service-level agreements (SLAs), or any other functionalities that would benefit from AI in the future.
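The ~100 Tb/s figure quoted above can be sanity-checked with back-of-the-envelope arithmetic. The ideal-Nyquist assumption (one symbol per second per hertz of bandwidth) and the number of aggregated channels are our own illustrative assumptions:

```python
# Sanity check of the data-rate figure: a 10 GHz channel carrying
# 10 bits per symbol at the ideal Nyquist rate.

def channel_rate_bps(bandwidth_hz, bits_per_symbol):
    """Ideal Nyquist signalling: one symbol per second per hertz."""
    return bandwidth_hz * bits_per_symbol

per_channel = channel_rate_bps(10e9, 10)  # 10 GHz channel, 10 b/symbol
print(per_channel / 1e9)                  # -> 100.0 (Gb/s per channel)

# Reaching ~100 Tb/s would then require aggregating on the order of
# 1000 such channels, i.e., roughly 10 THz of total spectrum:
channels_needed = 100e12 / per_channel
print(channels_needed)                    # -> 1000.0
```

This shows why the claim is tied to the THz and optical regimes: only there is ~10 THz of aggregate spectrum even conceivable, whereas the whole sub-6 GHz band is three orders of magnitude too narrow.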
The research on CML and MLC is closely related and interlinked. These two research directions will be instrumental in spearheading the vision of truly intelligent next-generation communications systems (i.e., the 6G mobile networks).[1]

CONCLUSION

The smart 6G networks of the future will integrate intelligent 5G and NG-IoT connectivity infrastructures with advanced edge-computing hardware that will support the computationally heavy execution of the AI algorithms. The introduction of AI at the network edge faces many challenges related to the fact that its targeted applications must operate in real-time dynamic environments and comply with strict end-to-end bit-rate, latency, reliability, and privacy/security requirements, as well as with edge devices' limitations, such as energy availability, battery lifetime, and memory size. In this 6G mobile network, the communications infrastructure connecting the foreseen billions of smart edge devices (collecting and distributing data) should be seen


as its "nervous system," while the edge computing hardware executing AI and other algorithms (e.g., blockchain for trustworthiness and security) over an SDN/NFV software control and management platform should be seen as its "brain." Such evolution, combined with the proliferation of smart (and wearable) wireless devices at the edge, will make the internet of everything (IoE) a reality. This highlights our vision for the evolution of current 5G and IoT infrastructures toward the smart converged 6G infrastructure that is expected to emerge after 2025, supporting in parallel the actual deployment of the fourth industrial revolution (Industry 4.0), autonomous driving, smart city/building services, AR services, and more that will be able to reshape our way of living.

REFERENCES

1. J. Park, S. Samarakoon, M. Bennis, and M. Debbah, "Wireless network intelligence at the edge," 2018, arXiv:1812.02858.

2. T. S. Rappaport et al., "Wireless communications and applications above 100 GHz: Opportunities and challenges for 6G and beyond," IEEE Access, vol. 7, pp. 78729–78757, 2019.

3. M. H. Mazaheri, S. Ameli, A. Abedi, and O. Abari, "A millimeter wave network for billions of things," in Proc. ACM Special Interest Group Data Commun., 2019.

4. S. Theodoridis, Machine Learning: A Bayesian and Optimization Perspective, 2nd ed. San Francisco, CA, USA: Academic, 2020.

5. J. Plata-Chaves, A. Bertrand, M. Moonen, S. Theodoridis, and A. M. Zoubir, "Heterogeneous and multitask wireless sensor networks—Algorithms, applications, and challenges," IEEE J. Sel. Topics Signal Process., vol. 11, no. 3, pp. 450–465, Apr. 2017.

6. S. Dörner, S. Cammerer, J. Hoydis, and S. ten Brink, "Deep learning-based communication over the air," IEEE J. Sel. Topics Signal Process., vol. 12, no. 1, pp. 132–143, Feb. 2018.

Ioannis Tomkos is currently a Professor of Optical Communications with the Department of Electrical and Computer Engineering, University of Patras, Patras, Greece.
In the past, he was Professor and Research Director of Next Generation Networks with AIT (Athens, Greece). His "High Speed Networks and Optical Communication (NOC)" Research Group was/is involved in more than 25 EU-funded research projects with a consortium-wide leading role. He is a Fellow of the IEEE, a Fellow of the IET, and a Fellow of the Optical Society. Together with his colleagues and students, he has authored more than 650 peer-reviewed archival articles. Contact him at itom@ait.gr.

Dimitrios Klonidis is currently the Head of the Network Softwarization and IoT Unit with Ubitech. He received the Ph.D. degree in optical communications from Essex University, Colchester, U.K., in 2006. He has been involved in numerous EU-funded research projects, gaining expertise in various areas, including optical network planning, control, and performance evaluation. He has coauthored more than 150 publications in international journals and conferences. Contact him at dklonidis@ubitech.eu.

Evangelos Pikasis is currently a DSP and Telecom Engineer with EULAMBIA Advanced Technologies Ltd., Greece. He received the Ph.D. degree in the area of optical communications from the National and Kapodistrian University of Athens, Athens, Greece, in 2014. His areas of expertise include optical communication systems, analog radio over fiber, advanced modulation schemes, digital signal processing, FPGAs, and cryptography at the physical layer. Contact him at evangelos.pikasis@eulambia.com.

Sergios Theodoridis is currently a Professor of Signal Processing and Machine Learning at the National and Kapodistrian University of Athens, Greece. He is the author of the book Machine Learning: A Bayesian and Optimization Perspective (Academic, 2nd ed., 2020). He received the 2017 EURASIP Athanasios Papoulis Award, the 2014 IEEE Signal Processing Society Education Award, and the 2014 EURASIP Meritorious Service Award. He currently serves as Vice President of the IEEE Signal Processing Society.
He is a Fellow of IET, a Corresponding Fellow of the Royal Society of Edinburgh, a Fellow of EURASIP, and a Fellow of IEEE. Contact him at stheodor@di.uoa.gr.
