Georgios B. Giannakis

University of Minnesota-Twin Cities

H-index: 160

North America-United States

Description

Georgios B. Giannakis is a distinguished researcher at the University of Minnesota-Twin Cities, with an exceptional h-index of 160 overall and a recent h-index of 65 (since 2020). He specializes in signal processing, machine learning, wireless communications, power systems, and network science.

Professor Information

University

University of Minnesota-Twin Cities

Position

Endowed Chair Professor, Dept. of ECE and DTC

Citations (all)

91,193

Citations (since 2020)

19,875

Cited By

80,294

h-index (all)

160

h-index (since 2020)

65

i10-index (all)

795

i10-index (since 2020)

356

Research & Interests List

Signal processing

machine learning

wireless communications

power systems

network science

Top articles of Georgios B. Giannakis

Meta-Learning With Versatile Loss Geometries for Fast Adaptation Using Mirror Descent

Utilizing task-invariant prior knowledge extracted from related tasks, meta-learning is a principled framework that empowers learning a new task, especially when data records are limited. A fundamental challenge in meta-learning is how to quickly "adapt" the extracted prior in order to train a task-specific model within a few optimization steps. Existing approaches deal with this challenge using a preconditioner that enhances convergence of the per-task training process. Though effective at locally representing a quadratic training loss, these simple linear preconditioners can hardly capture complex loss geometries. The present contribution addresses this limitation by learning a nonlinear mirror map, which induces a versatile distance metric that enables capturing and optimizing a wide range of loss geometries, hence facilitating the per-task training. Numerical tests on few-shot learning datasets demonstrate the superior …
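To make the adaptation loop concrete, here is a minimal sketch of mirror-descent task adaptation; the paper's learned nonlinear mirror map is stood in for by a fixed p-norm potential, and the quadratic toy loss and step sizes are illustrative assumptions, not the authors' method.

```python
import numpy as np

def grad_potential(w, p=3.0):
    # Mirror map: gradient of the potential (1/p) * ||w||_p^p (a fixed
    # stand-in for the paper's learned nonlinear mirror map).
    return np.sign(w) * np.abs(w) ** (p - 1)

def grad_potential_inv(z, p=3.0):
    # Inverse mirror map (dual space back to primal space).
    q = p / (p - 1.0)
    return np.sign(z) * np.abs(z) ** (q - 1)

def adapt(w_meta, grad_loss, steps=5, lr=0.1):
    # Few-step task adaptation: mirror descent from the meta-learned prior.
    w = w_meta.copy()
    for _ in range(steps):
        z = grad_potential(w) - lr * grad_loss(w)  # gradient step in dual space
        w = grad_potential_inv(z)                  # map back to primal space
    return w

# Hypothetical task: quadratic loss centered at w_star.
w_star = np.array([1.0, -2.0, 0.5])
w_adapted = adapt(np.zeros(3), lambda w: w - w_star)
print(w_adapted)
```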

Authors

Yilang Zhang, Bingcong Li, Georgios B. Giannakis

Published Date

2024/4/14

Learning Robust to Distributional Uncertainties and Adversarial Data

Successful training of data-intensive deep neural networks relies critically on vast, clean, high-quality datasets. In practice, however, data reliability diminishes, particularly with noisy, outlier-corrupted samples encountered at testing. The challenge intensifies when dealing with anonymized, heterogeneous datasets stored across geographically distinct locations due to, e.g., privacy concerns. The present paper introduces robust learning frameworks tailored for centralized and federated learning scenarios. Our goal is to fortify model resilience with a focus on (i) addressing distribution shifts from training to inference time; and (ii) ensuring test-time robustness when a trained model may encounter outliers or adversarially contaminated test data samples. To this aim, we start with a centralized setting where the true data distribution is considered unknown, but residing within a Wasserstein ball centered …
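A minimal sketch of the kind of Wasserstein-robust training the abstract alludes to, assuming the common penalized inner-maximization reformulation on a toy linear model; the synthetic data, penalty weight gamma, and step sizes are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))                          # toy features
y = X @ np.array([1.0, -1.0, 0.5]) + 0.1 * rng.normal(size=64)

w = np.zeros(3)
gamma, eta_x, eta_w = 5.0, 0.05, 0.1                  # penalty weight, step sizes
for _ in range(200):
    # Inner maximization: shift samples adversarially, with a quadratic
    # transport penalty keeping them near the observed data (the
    # Wasserstein-ball budget in penalized form).
    X_adv = X.copy()
    for _ in range(5):
        r = X_adv @ w - y
        g = np.outer(r, w) / len(y) - gamma * (X_adv - X)
        X_adv += eta_x * g
    # Outer minimization: descend on the worst-case (perturbed) loss.
    r = X_adv @ w - y
    w -= eta_w * (X_adv.T @ r) / len(y)
print("robust weights:", w)
```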

Authors

Alireza Sadeghi, Gang Wang, Georgios B. Giannakis

Journal

IEEE Journal on Selected Areas in Information Theory

Published Date

2024/3/26

A Bayesian Approach to High-Order Link Prediction

Using a subset of observed network links, high-order link prediction (HOLP) infers missing hyperedges, that is, links connecting three or more nodes. HOLP emerges in several applications, but existing approaches have not dealt with the associated predictor's performance. To overcome this limitation, the present contribution develops a Bayesian approach and the relevant predictive distributions that quantify model uncertainty. Gaussian processes model the dependence of each node on the remaining nodes. These nonparametric models yield predictive distributions, which are fused across nodes by means of a pseudo-likelihood-based criterion. Performance is quantified by proper measures of dispersion associated with the predictive distributions. Tests on benchmark datasets demonstrate the benefits of the novel approach.
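A minimal sketch of the per-node GP plus pseudo-likelihood fusion idea: each node gets a GP regression from the other nodes' membership indicators to its own, and a candidate hyperedge is scored by summing per-node Gaussian predictive log-densities. The kernel, noise level, and toy hyperedges below are assumptions for illustration.

```python
import numpy as np

def rbf(A, B, ls=1.0):
    # Squared-exponential kernel between rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_predict(Xtr, ytr, Xte, noise=1e-2):
    # Standard GP regression posterior mean and variance.
    K = rbf(Xtr, Xtr) + noise * np.eye(len(Xtr))
    Ks = rbf(Xte, Xtr)
    mean = Ks @ np.linalg.solve(K, ytr)
    var = 1.0 - np.einsum('ij,ji->i', Ks, np.linalg.solve(K, Ks.T)) + noise
    return mean, var

# Observed hyperedges over 5 nodes, one binary membership row per hyperedge.
H = np.array([[1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [1, 0, 1, 1, 0],
              [0, 0, 1, 1, 1.]])
cand = np.array([1, 1, 0, 1, 0.])       # candidate hyperedge to score

score = 0.0
for v in range(5):
    mask = np.arange(5) != v
    # One GP per node: predict node v's membership from the other nodes'.
    mean, var = gp_predict(H[:, mask], H[:, v], cand[None, mask])
    # Fusion across nodes: the pseudo-likelihood sums per-node log-densities.
    score += -0.5 * ((cand[v] - mean[0]) ** 2 / var[0] + np.log(2 * np.pi * var[0]))
print("pseudo-log-likelihood of candidate:", score)
```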

Authors

Georgios V. Karanikolas, Alba Pagès-Zamora, Georgios B. Giannakis

Published Date

2024/4/14

Enhancing sharpness-aware optimization through variance suppression

Sharpness-aware minimization (SAM) has well-documented merits in enhancing generalization of deep neural networks, even without sizable data augmentation. Embracing the geometry of the loss function, where neighborhoods of 'flat minima' heighten generalization ability, SAM seeks 'flat valleys' by minimizing the maximum loss caused by an adversary perturbing parameters within the neighborhood. Although it is critical to account for sharpness of the loss function, such an 'over-friendly adversary' can curtail the outmost level of generalization. The novel approach of this contribution fosters stabilization of adversaries through variance suppression (VaSSO) to avoid such friendliness. VaSSO's provable stability safeguards its numerical improvement over SAM in model-agnostic tasks, including image classification and machine translation. In addition, experiments confirm that VaSSO endows SAM with robustness against high levels of label noise.
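A minimal sketch contrasting SAM's instantaneous perturbation with a VaSSO-style stabilized adversary whose direction is a running average of stochastic gradients; the toy objective, EMA weight, and step sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
def noisy_grad(w):
    # Stochastic gradient of the toy objective 0.5 * ||w||^2.
    return w + 0.3 * rng.normal(size=w.shape)

w = np.array([2.0, -1.5])
d = np.zeros_like(w)                  # averaged gradient: the stabilized adversary
rho, lr, theta = 0.05, 0.1, 0.9       # radius, step size, EMA weight
for _ in range(100):
    g = noisy_grad(w)
    d = theta * d + (1 - theta) * g   # variance suppression: average out gradient noise
    eps = rho * d / (np.linalg.norm(d) + 1e-12)   # SAM-style worst-case perturbation
    w -= lr * noisy_grad(w + eps)     # descend using the gradient at the perturbed point
print("trained w:", w)
```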

Authors

Bingcong Li, Georgios Giannakis

Journal

Advances in Neural Information Processing Systems

Published Date

2024/2/13

Bayesian Self-Supervised Learning Using Local and Global Graph Information

Graph-guided learning has well-documented impact in a gamut of network science applications. A prototypical graph-guided learning task deals with semi-supervised learning over graphs, where the goal is to predict the nodal values or labels of unobserved nodes by leveraging a few nodal observations along with the underlying graph structure. This is particularly challenging under privacy constraints, or generally when acquiring nodal observations incurs high cost. In this context, the present work puts forth a Bayesian graph-driven self-supervised learning (Self-SL) approach that: (i) learns powerful nodal embeddings emanating from easier-to-solve auxiliary tasks that map local to global connectivity information; and (ii) adopts an ensemble of Gaussian processes (EGPs) with adaptive weights as nodal embeddings are processed online. Unlike most existing deterministic approaches, the novel approach offers …
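A minimal sketch of the adaptive-weight ensemble-of-GPs ingredient: several GP experts (here, one per kernel lengthscale) score incoming nodal observations, and the ensemble weights are updated online in proportion to each expert's predictive likelihood. The synthetic embeddings and targets are stand-ins for the paper's nodal data, assumed only for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.normal(size=(30, 2))                        # stand-in nodal embeddings
f = np.sin(Z[:, 0]) + 0.1 * rng.normal(size=30)     # stand-in nodal targets

def rbf(A, B, ls):
    d2 = ((A[:, None] - B[None]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

lengthscales = [0.5, 1.0, 2.0]                # one GP "expert" per kernel choice
logw = np.zeros(len(lengthscales))            # log ensemble weights
for t in range(5, 30):                        # nodal observations arrive one at a time
    Xtr, ytr, xt = Z[:t], f[:t], Z[t:t+1]
    for m, ls in enumerate(lengthscales):
        K = rbf(Xtr, Xtr, ls) + 1e-2 * np.eye(t)
        k = rbf(xt, Xtr, ls)
        mu = (k @ np.linalg.solve(K, ytr))[0]
        s2 = 1.0 - (k @ np.linalg.solve(K, k.T))[0, 0] + 1e-2
        # Online update: reward experts whose predictive density explains
        # the newly observed node well.
        logw[m] += -0.5 * ((f[t] - mu) ** 2 / s2 + np.log(2 * np.pi * s2))
w = np.exp(logw - logw.max())
print("adaptive ensemble weights:", w / w.sum())
```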

Authors

Konstantinos D. Polyzos, Alireza Sadeghi, Georgios B. Giannakis

Published Date

2023

Integrated Distributed Wireless Sensing with Over-The-Air Federated Learning

Over-the-air federated learning (OTA-FL) is a communication-efficient approach for achieving distributed learning tasks. In this paper, we aim to enhance OTA-FL by seamlessly integrating sensing into the communication-computation integrated system. Our research reveals that the wireless waveform used to convey OTA-FL parameters possesses inherent properties that make it well suited for sensing, thanks to its remarkable auto-correlation characteristics. By leveraging the OTA-FL learning statistics, i.e., the means and variances of local gradients in each training round, the sensing results can be embedded therein without the need for additional time or frequency resources. Finally, by accounting for the imperfections of learning statistics that are neglected in prior works, we arrive at an optimized transceiver design that maximizes the OTA-FL performance. Simulations validate that the proposed method not only …
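A minimal sketch of the over-the-air aggregation step, where the per-round gradient means and variances set the client-side pre-scaling and server-side de-normalization, assuming an idealized unit-gain multiple-access channel; the sizes and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
K, d = 8, 16                                         # clients, gradient dimension
grads = rng.normal(loc=0.5, scale=2.0, size=(K, d))  # local gradients this round

mu, sigma = grads.mean(), grads.std()                # per-round learning statistics
tx = (grads - mu) / sigma                            # client-side pre-scaling
y = tx.sum(axis=0) + 0.05 * rng.normal(size=d)       # channel superimposes analog signals + noise
g_hat = sigma * y / K + mu                           # server-side de-normalization
print("aggregation error:", np.abs(g_hat - grads.mean(axis=0)).max())
```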

Authors

Shijian Gao, Jia Yan, Georgios B. Giannakis

Published Date

2023/7/16

Physics-informed transfer learning for voltage stability margin prediction

Assessing set membership and evaluating distances to the related set boundary are problems of widespread interest, and can often be computationally challenging. Seeking efficient learning models for such tasks, this paper deals with voltage stability margin prediction for power systems. Supervised training of such models is conventionally hard due to the high-dimensional feature space and a cumbersome label-generation process. Nevertheless, one may find related, easy auxiliary tasks, such as voltage stability verification, that can aid in training for the hard task. This paper develops a novel approach for such settings by leveraging transfer learning. A Gaussian process-based learning model is efficiently trained using learning- and physics-based auxiliary tasks. Numerical tests demonstrate markedly improved performance that is harnessed alongside the benefit of uncertainty quantification to suit the needs of the …
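A minimal sketch of the transfer idea: cheap auxiliary labels (binary stability verdicts at sampled loading levels) bracket the margin and supplement a few expensive exact labels in a single GP, with the auxiliary targets assigned higher observation noise. The toy margin function, kernel, and noise levels are illustrative assumptions, not the paper's physics model.

```python
import numpy as np

rng = np.random.default_rng(4)
def true_margin(x):
    # Unknown, expensive-to-label mapping (toy stand-in).
    return 1.5 - 0.8 * x**2

def verify_stable(x, loading):
    # Cheap auxiliary task: is the system stable at this loading? (toy)
    return loading < true_margin(x)

def bracket_margin(x, grid=np.linspace(0, 2, 21)):
    # Coarse lower bound on the margin from binary verdicts alone.
    ok = np.array([verify_stable(x, l) for l in grid])
    return grid[ok].max() if ok.any() else 0.0

X_aux = rng.uniform(-1, 1, size=20)        # many cheap auxiliary points
y_aux = np.array([bracket_margin(x) for x in X_aux])
X_exp = rng.uniform(-1, 1, size=3)         # few expensive exact labels
y_exp = true_margin(X_exp)

X = np.r_[X_aux, X_exp]; y = np.r_[y_aux, y_exp]
noise = np.r_[np.full(20, 0.05), np.full(3, 1e-4)]   # trust exact labels more

def rbf(a, b, ls=0.5):
    return np.exp(-0.5 * (a[:, None] - b[None]) ** 2 / ls**2)

K = rbf(X, X) + np.diag(noise)
xs = np.linspace(-1, 1, 5)
mu = rbf(xs, X) @ np.linalg.solve(K, y)    # GP posterior mean of the margin
print(np.c_[xs, mu, true_margin(xs)])
```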

Authors

Manish K. Singh, Konstantinos D. Polyzos, Panagiotis A. Traganitis, Sairaj V. Dhople, Georgios B. Giannakis

Published Date

2023/6/4

Contractive error feedback for gradient compression

On-device memory concerns in distributed deep learning have become severe due to (i) the growth of model size in multi-GPU training, and (ii) the wide adoption of deep neural networks for federated learning on IoT devices, which have limited storage. In such settings, communication-efficient optimization methods are attractive alternatives; however, they still struggle with memory issues. To tackle these challenges, we propose a communication-efficient method called contractive error feedback (ConEF). As opposed to SGD with error feedback (EFSGD), which manages memory inefficiently, ConEF finds the sweet spot between convergence and memory usage, and achieves communication efficiency by leveraging biased and all-reducible gradient compression. We empirically validate ConEF on various learning tasks, including image classification, language modeling, and machine translation, and observe that ConEF saves 80%-90% of the extra memory of EFSGD with almost no loss in test performance, while also achieving a 1.3x-5x speedup over SGD. Through our work, we also demonstrate the feasibility and convergence of ConEF, clearing the theoretical barrier to integrating ConEF into popular memory-efficient frameworks such as ZeRO-3.
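A minimal sketch of error feedback with a contractive top-k compressor, where a lossy (here, randomly sparsified) error memory stands in for ConEF's compressed error storage; the compressor choices, sizes, and rates are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(5)
def topk(v, k):
    # Contractive top-k compressor: keep the k largest-magnitude entries.
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

d, lr = 100, 0.1
w = rng.normal(size=d)
e = np.zeros(d)                             # lossy error memory
for _ in range(200):
    g = w                                    # gradient of the toy loss 0.5 * ||w||^2
    c = topk(lr * g + e, k=10)               # only the compressed part is communicated
    w -= c                                   # apply the communicated update
    resid = lr * g + e - c                   # what compression left out this round
    mask = rng.random(d) < 0.5               # store the residual only coarsely,
    e = np.where(mask, resid, 0.0)           # mimicking ConEF's compressed memory
print("||w|| after training:", np.linalg.norm(w))  # typically much smaller than at init
```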

Authors

Bingcong Li, Shuai Zheng, Parameswaran Raman, Anshumali Shrivastava, Georgios B. Giannakis

Journal

arXiv preprint arXiv:2312.08538

Published Date

2023/12/13

Professor FAQs

What is Georgios B. Giannakis's h-index at University of Minnesota-Twin Cities?

The h-index of Georgios B. Giannakis has been 65 since 2020 and 160 in total.

What are Georgios B. Giannakis's research interests?

The research interests of Georgios B. Giannakis are: signal processing, machine learning, wireless communications, power systems, and network science.

What is Georgios B. Giannakis's total number of citations?

Georgios B. Giannakis has 91,193 citations in total.

What are the co-authors of Georgios B. Giannakis?

The co-authors of Georgios B. Giannakis include Alejandro Ribeiro, Nikolaos Sidiropoulos, Shengli Zhou, Sergio Barbarossa, Anna Scaglione, and Liuqing Yang.

Co-Authors

Alejandro Ribeiro, University of Pennsylvania (H-index: 71)

Nikolaos Sidiropoulos, University of Virginia (H-index: 71)

Shengli Zhou, University of Connecticut (H-index: 68)

Sergio Barbarossa, Sapienza Università di Roma (H-index: 63)

Anna Scaglione, Arizona State University (H-index: 63)

Liuqing Yang, University of Minnesota-Twin Cities (H-index: 62)
