Dec 2024 • arXiv preprint arXiv:2312.02102

Or Shalom, Amir Leshem, Waheed U Bajwa

Federated learning is a technique that allows multiple entities to collaboratively train models on their data without compromising data privacy. However, despite its advantages, federated learning is susceptible to false data injection attacks, in which a malicious entity with control over specific agents in the network manipulates the learning process, leading to a suboptimal model. Consequently, addressing these data injection attacks presents a significant research challenge in federated learning systems. In this paper, we propose a novel technique to detect and mitigate data injection attacks on federated learning systems. Our mitigation method is a local scheme, performed by the coordinating node during a single instance of training, allowing mitigation while the algorithm converges. Whenever an agent is suspected of being an attacker, its data are ignored for a certain period; this decision is regularly re-evaluated. We prove that with probability 1, after a finite time, all attackers will be ignored, while the probability of ignoring a truthful agent becomes 0, provided that there is a majority of truthful agents. Simulations show that when the coordinating node detects and isolates all the attackers, the model recovers and converges to the truthful model.
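The abstract does not give the detection statistic, so the following is only a minimal coordinator-side sketch of the scheme it describes (temporarily ignoring suspected agents, then re-evaluating the decision each round); the median-deviation test, the threshold, and the cooldown length are all illustrative assumptions, not the paper's method:

```python
import numpy as np

def aggregate_with_mitigation(updates, ignored_until, t, threshold=3.0, cooldown=5):
    """One coordinator round: flag agents whose update deviates strongly from
    the coordinate-wise median, ignore them for `cooldown` rounds, and
    aggregate only the currently trusted agents. Flags expire, so the
    suspicion decision is re-evaluated in later rounds.
    The deviation test and constants are illustrative, not the paper's."""
    updates = np.asarray(updates, dtype=float)      # shape: (n_agents, dim)
    median = np.median(updates, axis=0)             # robust reference point
    dev = np.linalg.norm(updates - median, axis=1)  # per-agent deviation
    scale = np.median(dev) + 1e-12                  # robust scale estimate
    for i, d in enumerate(dev):
        if d > threshold * scale:                   # suspected attacker
            ignored_until[i] = t + cooldown         # ignore for a while
    trusted = [i for i in range(len(updates)) if ignored_until.get(i, 0) <= t]
    return updates[trusted].mean(axis=0), trusted
```

With four honest agents near the truthful update and one attacker injecting a large value, the attacker is excluded and the aggregate stays near the truthful mean.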

Dec 2024 • arXiv e-prints

Ram Dyuthi Sristi, Ofir Lindenbaum, Maria Lavzin, Jackie Schiller, Gal Mishne, Hadas Benisty

We study the problem of contextual feature selection, where the goal is to learn a predictive function while identifying subsets of informative features conditioned on specific contexts. Towards this goal, we generalize the recently proposed stochastic gates (STG) of Yamada et al. [2020] by modeling the probabilistic gates as conditional Bernoulli variables whose parameters are predicted from the contextual variables. Our new scheme, termed conditional-STG (c-STG), comprises two networks: a hypernetwork that establishes the mapping between contextual variables and probabilistic feature-selection parameters, and a prediction network that maps the selected features to the response variable. Training the two networks simultaneously ensures the comprehensive incorporation of context and feature selection within a unified model. We provide a theoretical analysis to examine several properties of the proposed …
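As a rough illustration of the two-network idea (not the paper's architecture), the gates can be written as clipped Gaussian relaxations of Bernoulli variables whose means come from a hypernetwork; the +0.5 shift and hard clipping follow the Gaussian-based STG relaxation, while the single linear hypernetwork layer and sigma=0.5 are my own simplifying choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def conditional_gates(context, W, b, sigma=0.5, train=True):
    """Hypernetwork (here a single linear layer, an illustrative choice)
    maps the context to per-feature gate parameters mu(c); each gate is the
    clipped-Gaussian relaxation of a Bernoulli selection variable."""
    mu = context @ W + b                        # gate parameters from context
    eps = sigma * rng.standard_normal(mu.shape) if train else 0.0
    return np.clip(mu + eps + 0.5, 0.0, 1.0)    # relaxed gate value in [0, 1]

def predict(x, context, W, b, pred_fn):
    """Prediction network sees only context-selected features."""
    z = conditional_gates(context, W, b, train=False)
    return pred_fn(x * z)                       # features with z ~ 0 are off
```

For a context that drives mu positive for feature 0 and negative for feature 1, the gate vector saturates to [1, 0] at inference, so the prediction ignores feature 1 entirely.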

Dec 2024 • arXiv preprint arXiv:2312.13240

Amit Rozner, Barak Battash, Ofir Lindenbaum, Lior Wolf

We study the problem of performing face verification with an efficient neural model. The efficiency stems from simplifying the face verification problem from an embedding nearest-neighbor search into a binary problem; each user has its own neural network. To allow information sharing between different individuals in the training set, we do not train the per-user networks directly but instead generate their weights using a hypernetwork. This leads to a compact personalized model for face identification that can be deployed on edge devices. Key to the method's success is a novel way of generating hard negatives and carefully scheduling the training objectives. Our model is substantially small, requiring only 23k parameters and 5M floating-point operations (FLOPS). We use six face verification datasets to demonstrate that our method is on par with or better than state-of-the-art models, with a significantly reduced number of parameters and computational burden. Furthermore, we perform an extensive ablation study to demonstrate the importance of each element in our method.

Dec 2024 • Quantum Science and Technology

Rafael Wagner, Zohar Schwartzman-Nowik, Ismael Lucas Paiva, Amit Te'eni, Antonio Ruiz-Molero, Rui Soares Barbosa, Eliahu Cohen, Ernesto Galvão

Weak values and Kirkwood–Dirac (KD) quasiprobability distributions have been independently associated with both foundational issues in quantum theory and advantages in quantum metrology. We propose simple quantum circuits to measure weak values, KD distributions, and spectra of density matrices without the need for post-selection. This is achieved by measuring unitary-invariant, relational properties of quantum states, which are functions of Bargmann invariants, the concept that underpins our unified perspective. Our circuits also enable experimental implementation of various functions of KD distributions, such as out-of-time-ordered correlators (OTOCs) and the quantum Fisher information in post-selected parameter estimation, among others. An upshot is a unified view of nonclassicality in all those tasks. In particular, we discuss how negativity and imaginarity of Bargmann invariants relate to set coherence.

Dec 2024 • Intelligent Systems with Applications

Ohad Volk, Gonen Singer

We design an adaptive learning algorithm for binary classification problems whose objective is to reduce the cost of misclassified instances derived from the consequences of errors. Our algorithm (Adaptive Cost-Sensitive Learning — AdaCSL) adaptively adjusts the loss function to bridge the difference in class distributions between subgroups of samples in the training and validation data sets. This adjustment is made for samples with similar predicted probabilities, in such a way that the local cost decreases. This process usually leads to a reduction in cost when applied to the test data set (i.e., under a local training–test class-distribution mismatch). We present empirical evidence that neural networks used with the proposed algorithm yield better cost results on several data sets compared to other approaches. In addition, the proposed AdaCSL algorithm can optimize evaluation metrics other than cost. We …

Nov 2024 • bioRxiv

Yaron Trink, Achia Urbach, Benjamin Dekel, Peter Hohenstein, Jacob Goldberger, Tomer Kalisky

The significant heterogeneity of Wilms’ tumors between different patients is thought to arise from genetic and epigenetic distortions that occur during various stages of fetal kidney development in a way that is poorly understood. To address this, we characterized the heterogeneity of alternative mRNA splicing in Wilms’ tumors using a publicly available RNAseq dataset of high-risk Wilms’ tumors and normal kidney samples. Through Pareto task inference and cell deconvolution, we found that the tumors and normal kidney samples are organized according to progressive stages of kidney development within a triangle-shaped region in latent space, whose vertices, or “archetypes,” resemble the cap mesenchyme, the nephrogenic stroma, and epithelial tubular structures of the fetal kidney. We identified a set of genes that are alternatively spliced between tumors located in different regions of latent space and found that many of these genes are associated with the Epithelial to Mesenchymal Transition (EMT) and muscle development. Using motif enrichment analysis, we identified putative splicing regulators, some of which are associated with kidney development. Our findings provide new insights into the etiology of Wilms’ tumors and suggest that specific splicing mechanisms in early stages of development may contribute to tumor development in different patients.

Nov 2024 • Journal of Biomedical Optics 29 (3), 037003, 2024

Zeev Kalyuzhner, Sergey Agdarov, Yevgeny Beiderman, Aviya Bennett, Yafim Beiderman, Zeev Zalevsky

Intraocular pressure (IOP) measurement is an essential tool in modern medicine for the early diagnosis of glaucoma, the second leading cause of human blindness. The world's highest prevalence of glaucoma is in low-income countries. Current diagnostic methods require experience in operating expensive equipment as well as the use of anesthetic eye drops. We present herein a remote photonic IOP biomonitoring method based on deep learning of secondary speckle patterns, captured by a fast camera, that are reflected from the eye sclera stimulated by an external sound wave. By combining speckle pattern analysis with deep learning, high-precision measurements are possible.

Oct 2024 • Optics & Laser Technology

Ricardo Rubio-Oliver, Vicente Micó, Zeev Zalevsky, Javier García, Jose Angel Picazo-Bueno

Digital holographic microscopy (DHM) is a very popular interferometric technique for quantitative phase imaging (QPI). In DHM, an interferometer is combined with a microscope to create interference between an imaging beam containing information about the analysed sample and a clear reference beam carrying no sample information. To exploit the capability of the reference beam in terms of useful sample information, we have recently proposed Cepstrum-based Interferometric Microscopy (CIM) [Opt. Las. Tech. 174, 110626 (2024)], a novel methodology that interferes two imaging beams carrying different sample information and accurately retrieves quantitative phase data of both beams. In the earlier implementation, a proof of concept of CIM was demonstrated for a Michelson-based layout requiring manual adjustments during the CIM methodology and validated only for low numerical aperture (NA …

Oct 2024 • arXiv preprint arXiv:2110.00494

Ofir Lindenbaum, Yariv Aizenbud, Yuval Kluger

Anomalies (or outliers) are prevalent in real-world empirical observations and potentially mask important underlying structures. Accurate identification of anomalous samples is crucial for the success of downstream data analysis tasks. To automatically identify anomalies, we propose the Probabilistic Robust AutoEncoder (PRAE). PRAE aims to simultaneously remove outliers and identify a low-dimensional representation for the inlier samples. We first present the Robust AutoEncoder (RAE) objective as a minimization problem for splitting the data into inliers and outliers. Our objective is designed to exclude outliers while including a subset of samples (inliers) that can be effectively reconstructed using an AutoEncoder (AE). RAE minimizes the autoencoder's reconstruction error while incorporating as many samples as possible. This can be formulated via regularization, by subtracting an ℓ0 norm counting the number of selected samples from the reconstruction term. Unfortunately, this leads to an intractable combinatorial problem. Therefore, we propose two probabilistic relaxations of RAE, which are differentiable and alleviate the need for a combinatorial search. We prove that the solution to the PRAE problem is equivalent to the solution of RAE. We use synthetic data to show that PRAE can accurately remove outliers in a wide range of contamination levels. Finally, we demonstrate that using PRAE for anomaly detection leads to state-of-the-art results on various benchmark datasets.
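The ℓ0 term and a probabilistic relaxation of it can be sketched as follows, with the autoencoder abstracted into precomputed per-sample reconstruction errors; the sigmoid relaxation and its sharpness tau are illustrative stand-ins, not PRAE's actual parameterization:

```python
import numpy as np

def soft_gate(rec_err, lam, tau=0.01):
    """Differentiable relaxation of the hard keep/drop rule 1[err_i < lam]:
    a sigmoid of sharpness tau (clipped for numerical stability)."""
    rec_err = np.asarray(rec_err, dtype=float)
    z = np.clip((rec_err - lam) / tau, -50.0, 50.0)
    return 1.0 / (1.0 + np.exp(z))

def rae_objective(rec_err, lam, tau=0.01):
    """Relaxed RAE objective: sum_i pi_i * err_i - lam * sum_i pi_i, i.e.
    reconstruction error over the samples kept, minus a reward (the relaxed
    l0 count) for keeping as many samples as possible."""
    pi = soft_gate(rec_err, lam, tau)
    return float(np.sum(pi * rec_err) - lam * np.sum(pi))
```

Samples with reconstruction error well below lam get a gate near 1 (inliers), while a sample with a large error gets a gate near 0 and drops out of the objective.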

Oct 2024 • Nature Nanotechnology

Longlong Wang, Ayan Mukherjee, Chang-Yang Kuo, Sankalpita Chakrabarty, Reut Yemini, Arrelaine A Dameron, Jaime W DuMont, Sri Harsha Akella, Arka Saha, Sarah Taragin, Hagit Aviv, Doron Naveh, Daniel Sharon, Ting-Shan Chan, Hong-Ji Lin, Jyh-Fu Lee, Chien-Te Chen, Boyang Liu, Xiangwen Gao, Suddhasatwa Basu, Zhiwei Hu, Doron Aurbach, Peter G Bruce, Malachi Noked

A critical current challenge in the development of all-solid-state lithium batteries (ASSLBs) is reducing the cost of fabrication without compromising the performance. Here we report a sulfide ASSLB based on a high-energy, Co-free LiNiO2 cathode with a robust outside-in structure. This promising cathode is enabled by the high-pressure O2 synthesis and subsequent atomic layer deposition of a unique ultrathin LixAlyZnzOδ protective layer comprising a LixAlyZnzOδ surface coating region and an Al and Zn near-surface doping region. This high-quality artificial interphase enhances the structural stability and interfacial dynamics of the cathode as it mitigates the contact loss and continuous side reactions at the cathode/solid electrolyte interface. As a result, our ASSLBs exhibit a high areal capacity (4.65 mAh cm−2), a high specific cathode capacity (203 mAh g−1), superior cycling stability (92% capacity retention …

Sep 2024 • arXiv preprint arXiv:2309.01347

Ashwin Ramasubramaniam, Doron Naveh

Modulation of the electronic properties of materials by electric fields is central to the operation of modern semiconductor devices, providing access to complex electronic behaviors and greater freedom in tuning the energy bands of materials. Here, we explore one-dimensional superlattices induced by a confining electrostatic potential in monolayer MoS2, a prototypical two-dimensional semiconductor. Using first-principles calculations, we show that periodic potentials applied to monolayer MoS2 induce electrostatic superlattices in which the response is dominated by structural distortions relative to purely electronic effects. These structural distortions substantially reduce the intrinsic band gap of the monolayer while also polarizing the monolayer through piezoelectric coupling, resulting in spatial separation of charge carriers as well as Stark shifts that produce dispersive minibands. Importantly, these minibands inherit the valley-selective magnetic properties of monolayer MoS2, enabling fine control over spin-valley coupling in MoS2 and similar transition-metal dichalcogenides.

Aug 2024 • arXiv preprint arXiv:2308.14075

Gil Shapira, Yosi Keller

In set-based face recognition, we aim to compute the most discriminative descriptor from an unbounded set of images and videos showing a single person. A discriminative descriptor balances two policies when aggregating information from a given set. The first is a quality-based policy: emphasizing high-quality and down-weighting low-quality images. The second is a diversity-based policy: emphasizing unique images in the set and down-weighting multiple occurrences of similar images, as found in video clips, which can overwhelm the set representation. This work frames face-set representation as a differentiable coreset selection problem. Our model learns how to select a small coreset of the input set that balances quality and diversity policies using a learned metric parameterized by the face quality, optimized end-to-end. The selection process is a differentiable farthest-point sampling (FPS) realized by approximating the non-differentiable Argmax operation with differentiable sampling from the Gumbel-Softmax distribution of distances. The small coreset is later used as queries in a self- and cross-attention architecture to enrich the descriptor with information from the whole set. Our model is order-invariant and linear in the input set size. We set a new SOTA for set-based face verification on the IJB-B and IJB-C datasets. Our code is publicly available.
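The differentiable-FPS idea can be sketched as follows: Gumbel noise plus a softmax stands in for the hard argmax in the farthest-point step. Everything here (the seed point, the temperature, the use of a precomputed distance matrix) is an illustrative assumption rather than the paper's model:

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_argmax_onehot(scores, tau=0.1):
    """Gumbel-Softmax relaxation of argmax: a soft one-hot vector over
    `scores` that concentrates on the largest entry as tau -> 0."""
    g = -np.log(-np.log(rng.uniform(size=scores.shape)))  # Gumbel(0,1) noise
    y = (scores + g) / tau
    y = np.exp(y - y.max())                               # stable softmax
    return y / y.sum()

def soft_fps(dist, k, tau=0.1):
    """Farthest-point sampling of k points from an (n, n) distance matrix,
    with the soft argmax above standing in for the hard selection step."""
    selected = [0]                              # arbitrary seed point
    min_d = dist[0].astype(float)               # distance to nearest selected
    for _ in range(k - 1):
        w = soft_argmax_onehot(min_d, tau)      # soft farthest-point choice
        idx = int(w.argmax())                   # hard index for bookkeeping
        selected.append(idx)
        min_d = np.minimum(min_d, dist[idx])
    return selected
```

On three collinear points at 0, 0.1, and 100, the second selected point is the far one, matching what hard FPS would pick; in a real model the soft one-hot vector `w`, not the hard index, is what carries gradients.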

Jul 2024 • Optics & Laser Technology

Ricardo Rubio-Oliver, Javier García, Zeev Zalevsky, José Ángel Picazo-Bueno, Vicente Micó

A universal methodology for coding and decoding the complex amplitude field of an imaged sample in coherent microscopy is presented, with no restrictions on either of the two interferometric beams. Thus, the imaging beam can be overlapped with, in general, any other complex amplitude distribution and, in particular, with a coherent and shifted version of itself along two orthogonal directions. The complex field values are retrieved by a novel Cepstrum-based algorithm, named Spatial-Shifting Cepstrum (SSC), based on a weighted subtraction of the Cepstrum transform in the cross-correlation term of the object field spectrum, together with the generation of a complex pupil from the combination of the information retrieved from different holographic recordings (one in the horizontal and one in the vertical direction) in which one of the interferometric beams is shifted by 1 pixel. As a result, the field of view is …

Jul 2024 • arXiv preprint arXiv:2307.01874

Michael Suleymanov, Ismael L Paiva, Eliahu Cohen

Quantum reference frames have attracted renewed interest recently, as their exploration is relevant and instructive in many areas of quantum theory. Among the different types, position and time reference frames have drawn special attention. Here, we introduce and analyze a non-relativistic framework in which each system contains an internal clock, in addition to its external (spatial) degree of freedom, and hence can be used as a spatiotemporal quantum reference frame. Among other applications of this framework, we show that even in simple scenarios with no interactions, the relative uncertainty between clocks affects the relative spatial spread of the systems.

Jun 2024 • Engineering Applications of Artificial Intelligence

Lior Rabkin, Ilan Cohen, Gonen Singer

Ordinal classification tasks that require the allocation of limited resources are prevalent in various real-world scenarios. Examples include assessing disease severity in the context of medical resource allocation and categorizing the quality of machines as good, medium, or bad to schedule maintenance treatment within capacity constraints. We propose a comprehensive analytic framework for scenarios that, in addition to posing ordinal classification problems, also constrain the number of samples classified into each class due to resource limitations. The framework uses a probability matrix generated by a trained ordinal classifier as the input to an optimization model with a minimum-misclassification-cost objective and resource allocation constraints. We illustrate the equivalence between the formulation of the resource allocation problem over samples and the transportation problem, enabling the utilization …

Jun 2024 • arXiv preprint arXiv:2306.04785

Jonathan Svirsky, Ofir Lindenbaum

Clustering is a fundamental learning task widely used as a first step in data analysis. For example, biologists often use cluster assignments to analyze genome sequences, medical records, or images. Since downstream analysis is typically performed at the cluster level, practitioners seek reliable and interpretable clustering models. We propose a new deep-learning framework that predicts interpretable cluster assignments at the instance and cluster levels. First, we present a self-supervised procedure to identify a subset of informative features from each data point. Then, we design a model that predicts cluster assignments and a gate matrix that leads to cluster-level feature selection. We show that the proposed method can reliably predict cluster assignments using synthetic and real data. Furthermore, we verify that our model leads to interpretable results at a sample and cluster level.

Jun 2024 • arXiv preprint arXiv:2306.00582

Amit Rozner, Barak Battash, Henry Li, Lior Wolf, Ofir Lindenbaum

Density-estimation-based anomaly detection schemes typically model anomalies as examples that reside in low-density regions. We propose a modified density estimation problem and demonstrate its effectiveness for anomaly detection. Specifically, we assume the density function of normal samples is uniform in some compact domain. This assumption implies the density function is more stable (with lower variance) around normal samples than around anomalies. We first corroborate this assumption empirically using a wide range of real-world data. Then, we design a variance-stabilized density estimation problem for maximizing the likelihood of the observed samples while minimizing the variance of the density around normal samples. We introduce an ensemble of autoregressive models to learn the variance-stabilized distribution. Finally, we perform an extensive benchmark with 52 datasets, demonstrating that our method leads to state-of-the-art results while alleviating the need for data-specific hyperparameter tuning.
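The stated objective (push the likelihood of observed samples up, push the variance of the density around them down) can be sketched as follows; the neighborhood size eps, the number of perturbations m, and the weight lam are my own knobs, and the paper learns the distribution with an ensemble of autoregressive models rather than being handed a log-density function:

```python
import numpy as np

def vs_objective(log_p, x, eps=0.05, m=8, lam=1.0, rng=None):
    """Variance-stabilized objective sketch: mean log-density of the samples
    minus a penalty on the variance of log-density in a small neighborhood
    of each sample. log_p maps a point to its log-density."""
    rng = rng or np.random.default_rng(0)
    ll = np.mean([log_p(xi) for xi in x])       # likelihood term (maximize)
    var = 0.0
    for xi in x:                                # local stability term
        near = [log_p(xi + eps * rng.standard_normal(np.shape(xi)))
                for _ in range(m)]
        var += np.var(near)
    return ll - lam * var / len(x)
```

A density that is flat around the normal samples pays no variance penalty, while one that oscillates near them scores strictly worse, which is the intuition the abstract describes.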

May 2024 • arXiv preprint arXiv:2405.03278

Amotz Bar-Noy, Toni Bohnlein, David Peleg, Yingli Ran, Dror Rawitz

We study the question of whether a sequence d = (d_1, d_2, ..., d_n) of positive integers is the degree sequence of some outerplanar (a.k.a. 1-page book embeddable) graph G. If so, G is an outerplanar realization of d and d is an outerplanaric sequence. The case where sum(d) <= 2n - 2 is easy, as d then has a realization by a forest (which is trivially an outerplanar graph). In this paper, we consider the family D of all sequences d of even sum satisfying 2n <= sum(d) <= 4n - 6 - 2*m_1, where m_x is the number of x's in d. (The second inequality is a necessary condition for a sequence d with sum(d) >= 2n to be outerplanaric.) We partition D into two disjoint subfamilies, D = D_NOP ∪ D_2PBE, such that every sequence in D_NOP is provably non-outerplanaric, and every sequence in D_2PBE is given a realizing graph enjoying a 2-page book embedding (and moreover, one of the pages is also bipartite).
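The necessary condition from the abstract is straightforward to check; the sketch below (variable names are mine) tests it for a degree sequence with sum at least 2n, keeping in mind that passing is necessary but not sufficient for outerplanarity:

```python
def passes_outerplanaric_test(d):
    """Necessary conditions for a degree sequence d (n = len(d)) with
    sum(d) >= 2n to be outerplanaric: the sum is even, and
    sum(d) <= 4n - 6 - 2*m1, where m1 is the number of 1's in d.
    Passing this test does not guarantee an outerplanar realization."""
    n, s, m1 = len(d), sum(d), d.count(1)
    return s % 2 == 0 and s <= 4 * n - 6 - 2 * m1
```

For example, the 4-cycle C4 (degrees 2,2,2,2; sum 8 <= 10) passes, while K4 (degrees 3,3,3,3; sum 12 > 10) fails, consistent with K4 not being outerplanar.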

May 2024 • Algorithmica

Magnús M Halldórsson, Dror Rawitz

We study the Online Multiset Submodular Cover problem (OMSC), where we are given a universe U of elements and a collection of subsets. Each element is associated with a nonnegative, nondecreasing, submodular, polynomially computable set function. Initially, the elements are uncovered, and we therefore pay a penalty for each unit of uncovered element. Subsets with various coverage and cost arrive online. Upon arrival of a new subset, the online algorithm must decide how many copies of the arriving subset to add to the solution. This decision is irrevocable, in the sense that the algorithm will not be able to add more copies of this subset in the future. On the other hand, the algorithm can drop copies of a subset, but such copies cannot be retrieved later. The goal is to minimize the total cost of the subsets taken plus the penalties for uncovered elements. We present an -competitive algorithm for OMSC that does not …
