- Ragesh Jaiswal, Amit Kumar, and Jatin Yadav
FPT Approximation for Capacitated Sum of Radii, [arXiv][ITCS'24]
Short Description: We improve the state of the art in FPT approximation algorithms for the capacitated sum-of-radii problem. We also show hardness-of-approximation results under the Exponential Time Hypothesis (ETH).
|
- Ragesh Jaiswal
A Quantum Approximation Scheme for k-Means,
[arXiv][Poster at TQC'24]
Short Description: We give the first quantum algorithm with a polylogarithmic running time that gives a provable approximation guarantee of (1+ε) (i.e., an approximation scheme) for the k-means problem.
|
- Ragesh Jaiswal and Amit Kumar
Universal Weak Coreset,
[AAAI'24][A graphic novel presentation (PDF)]
Short Description: We extend the idea of a weak coreset to constrained settings while simultaneously breaking the "log n barrier" on the size of the coreset in general metric spaces.
|
- Ragesh Jaiswal and Amit Kumar
Clustering What Matters in Constrained Settings,
[arXiv][Slides][ISAAC'23 (best paper award)]
Short Description: We give an approximation-preserving reduction from the outlier to the outlier-free setting for any constrained k-median/means problem in arbitrary metric spaces, enabling developments in outlier-free versions to be carried over to their outlier counterparts.
|
Dishant Goyal and Ragesh Jaiswal,
Tight FPT Approximation for Constrained k-Center and k-Supplier,
[arXiv][Theoretical Computer Science]
Short Description: We give tight FPT algorithms for the k-center problem (minimise the maximum radius of k clusters) under arbitrary constraints on the clusters. We show that any bi-criteria approximate solution for the unconstrained k-center problem contains an approximate solution for any constrained version of the problem. This yields simple FPT algorithms (polynomial-time when k is a constant) for a range of constrained k-center clustering problems, including some recently formulated fair clustering problems. We also show that our results are tight under well-known complexity-theoretic assumptions.
|
Dishant Goyal and Ragesh Jaiswal,
Tight FPT Approximation for Socially Fair Clustering, [arXiv][Information Processing Letters]
Short Description: We extend a recent line of work on Socially Fair clustering by giving a tight FPT algorithm.
|
- Anup Bhattacharya, Ragesh Jaiswal, and Yoav Freund
On the k-means/median cost function.
[Information Processing Letters]
Short Description: The k-means/median cost decreases as k increases. In this work, we try to understand this behaviour more closely.
|
- Anup Bhattacharya, Dishant Goyal, and Ragesh Jaiswal,
Hardness of Approximation of Euclidean k-Median. APPROX'21.
[arXiv]
Short Description: We give a hardness of approximation result for the Euclidean k-median problem assuming the Unique Games Conjecture (UGC). This resolves an open question posed in earlier work.
|
- Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
FPT Approximation for Constrained Metric k-Median/Means. IPEC'20.
[arXiv][IPEC]
Short Description: We give (3+ε)-approximation and (9+ε)-approximation algorithms, in FPT time, for various constrained versions of the k-median and k-means problems, respectively.
|
Anup Bhattacharya, Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
On Sampling Based Algorithms for k-Means. FSTTCS'20. [FSTTCS]
Short Description: This is a combined presentation of the works below.
- Anup Bhattacharya, Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
Streaming PTAS for Binary ℓ0-Low Rank Approximation.
[arXiv]
Short Description: We give constant-pass, polylog-space streaming PTAS for the constrained binary k-means problem and the generalised binary ℓ0-rank-r approximation problem.
- Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
Streaming PTAS for Constrained k-Means.
[arXiv]
Short Description: We generalise our previous results and use them to obtain constant-pass, logspace streaming PTASs for a variety of constrained k-means problems, as well as a faster PTAS for k-means under stability.
|
- Ragesh Jaiswal and Amit Kumar,
Multiplicative Rank-1 Approximation using Length-Squared Sampling. SOSA@SODA'20.
[arXiv][Talk]
Short Description: We show that the span of Ω(1/ε⁴) rows of any matrix, sampled according to the length-squared distribution, contains a rank-1 matrix B that gives a (1+ε) multiplicative approximation under the Frobenius norm. Previously known results gave only additive approximations.
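Length-squared sampling, the distribution mentioned above, picks row i of a matrix A with probability ||A_i||² / ||A||_F². A minimal sketch of just the sampling step (illustrative only; the rank-1 approximation guarantee is the paper's contribution):

```python
import numpy as np

def length_squared_rows(A, s, rng=None):
    """Sample s row indices of A i.i.d., picking row i with probability
    proportional to its squared Euclidean norm ||A_i||^2."""
    rng = np.random.default_rng() if rng is None else rng
    norms_sq = np.einsum('ij,ij->i', A, A)   # ||A_i||^2 for each row
    probs = norms_sq / norms_sq.sum()        # normalise by ||A||_F^2
    return rng.choice(A.shape[0], size=s, p=probs)
```

Rows with large norm, which contribute most to the Frobenius error, are sampled more often; zero rows are never sampled.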
|
- Ragesh Jaiswal,
A note on the relation between XOR and Selective XOR Lemmas.
[Information Processing Letters]
[arXiv]
Short Description: We give a simple reduction from the Selective XOR lemma to the standard XOR lemma.
|
Nir Ailon, Anup Bhattacharya, Ragesh Jaiswal, and Amit Kumar,
Approximate Clustering with Same-Cluster Queries.
ITCS 2018. [arXiv][Talk]
Short Description: In this work, we explore the power and limitations of same-cluster queries in the context of approximate clustering.
|
Ragesh Jaiswal and Sandeep Sen,
Approximate Clustering [PDF]
Book chapter to appear in Approximation Algorithms and Metaheuristics - Volume II
Short Description: A survey of sampling-based clustering algorithms.
|
- Russell Impagliazzo, Ragesh Jaiswal, Valentine Kabanets, Bruce M. Kapron, Valerie King, and Stefano Tessaro,
Simultaneous Secrecy and Reliability Amplification for a General Channel Model,
TCC-B, 2016. [PDF]
Short Description: We present a general notion of a channel for cryptographic purposes, which can model a (classical) physical channel, the consequences of a cryptographic protocol, or any hybrid of the two. We consider simultaneous secrecy and reliability amplification for such channels. We show that this amplification is not possible for the most general channel model but, at least for some parameter ranges, it is possible for a restricted class of channels that still includes both standard information-theoretic channels and keyless cryptographic protocols.
|
- Anup Bhattacharya, Ragesh Jaiswal, and Amit Kumar,
Faster Algorithms for the Constrained k-means Problem,
STACS 2016. Theory of Computing Systems [STACS special issue].
[arXiv] [Slides]
Short Description: Classical center-based clustering problems such as k-means/median/center assume that the optimal clusters satisfy a locality property: points in the same cluster are close to each other. However, a number of clustering problems arising in machine learning have optimal clusters that do not obey such a locality property. For instance, consider the r-gather clustering problem, where each cluster must contain at least r points, or the capacitated clustering problem, where there is an upper bound on cluster sizes. In this work, we consider an extreme generalisation of such problems by allowing the target clustering to be an arbitrary partition of the dataset, and we show some interesting results in this scenario.
|
- Anup Bhattacharya, Davis Issac, Ragesh Jaiswal, and Amit Kumar,
Sampling in Space Restricted Settings,
COCOON 2015. Algorithmica [COCOON special issue].
[arXiv]
Short Description: Consider the problem of sampling a uniformly random element from a data stream. This is a standard interview question, and the classic solution is to store the first item and, when item i (i > 1) arrives, replace the currently stored element with item i with probability 1/i, doing nothing with the remaining probability. It is simple to argue that after seeing n items, the probability that the stored element is item i is exactly 1/n for every i ≤ n. The space required for this algorithm is O(log n), which is optimal. However, its randomness requirement is O(n log n) bits, which is large. Randomness is a relevant resource in computation since generating (pseudo)randomness costs energy. In the first part of this work, we show that the same goal can be achieved with O(log n) space and only O(log n) random bits.
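The classic single-element reservoir-sampling procedure described above can be sketched as follows (an illustration of the textbook algorithm, not the low-randomness algorithm from the paper):

```python
import random

def reservoir_sample(stream):
    """Return a uniformly random element of an iterable of unknown length."""
    chosen = None
    for i, item in enumerate(stream, start=1):
        # Item 1 is always stored (probability 1/1); thereafter item i
        # replaces the stored element with probability 1/i.
        if random.randrange(i) == 0:
            chosen = item
    return chosen
```

After n items, item i survives with probability (1/i) · ∏_{j=i+1..n} (1 − 1/j) = 1/n. Each step consumes roughly log i random bits, which is where the O(n log n) total randomness of the classic solution comes from.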
|