 Ragesh Jaiswal, Amit Kumar, and Jatin Yadav
FPT Approximation for Capacitated Sum of Radii, [arXiv][ITCS'24]
Short Description: We improve the state of the art in FPT approximation algorithms for the capacitated sum-of-radii problem. We also show hardness of approximation under the Exponential Time Hypothesis (ETH).

 Ragesh Jaiswal
A Quantum Approximation Scheme for k-Means,
[arXiv][Poster at TQC'24]
Short Description: We give the first quantum algorithm with polylogarithmic running time that achieves a provable (1+ε) approximation guarantee (i.e., an approximation scheme) for the k-means problem.

 Ragesh Jaiswal and Amit Kumar
Universal Weak Coreset,
[AAAI'24][A graphic novel presentation (PDF)]
Short Description: We extend the idea of a weak coreset to constrained settings while simultaneously breaking the "log n barrier" on the size of the coreset in general metric spaces.

 Ragesh Jaiswal and Amit Kumar
Clustering What Matters in Constrained Settings,
[arXiv][Slides][ISAAC'23 (best paper award)]
Short Description: We give an approximation-preserving reduction from any constrained k-median/k-means problem with outliers to its outlier-free counterpart in arbitrary metric spaces, so that developments on the outlier-free versions carry over to their outlier counterparts.

 Dishant Goyal and Ragesh Jaiswal,
Tight FPT Approximation for Constrained k-Center and k-Supplier,
[arXiv][Theoretical Computer Science]
Short Description: We give tight FPT algorithms for the k-center problem (minimise the maximum radius of the k clusters) under arbitrary constraints on the clusters. We show that any bicriteria approximate solution for the unconstrained k-center problem contains an approximate solution for any constrained version of the problem. This yields simple FPT algorithms (running in time f(k)·poly(n)) for a range of constrained k-center clustering problems, including several recently formulated fair clustering problems. We also show that our results are tight under well-known complexity-theoretic assumptions.

 Dishant Goyal and Ragesh Jaiswal,
Tight FPT Approximation for Socially Fair Clustering, [arXiv][Information Processing Letters]
Short Description: We extend a recent line of work on socially fair clustering by giving a tight FPT approximation algorithm.

 Anup Bhattacharya, Ragesh Jaiswal, and Yoav Freund
On the k-means/median cost function.
[Information Processing Letters]
Short Description: The optimal k-means/median cost decreases as k increases. In this work, we study this behaviour more closely.
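The monotonicity itself is easy to see: any k-clustering is also a valid (k+1)-clustering with one cluster left empty, so the optimal cost cannot increase with k. A minimal brute-force illustration (example data is hypothetical; tiny inputs only):

```python
from itertools import product

def kmeans_cost(points, k):
    """Optimal 1-D k-means cost by brute force: try every assignment
    of points to k clusters and keep the cheapest (tiny inputs only)."""
    best = float("inf")
    n = len(points)
    for labels in product(range(k), repeat=n):
        cost = 0.0
        for c in range(k):
            cluster = [points[i] for i in range(n) if labels[i] == c]
            if cluster:
                mean = sum(cluster) / len(cluster)
                cost += sum((x - mean) ** 2 for x in cluster)
        best = min(best, cost)
    return best

pts = [0.0, 1.0, 4.0, 9.0, 10.0]
costs = [kmeans_cost(pts, k) for k in range(1, 5)]
# The optimal cost is non-increasing in k.
assert all(costs[i] >= costs[i + 1] for i in range(len(costs) - 1))
```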

 Anup Bhattacharya, Dishant Goyal, and Ragesh Jaiswal,
Hardness of Approximation of Euclidean k-Median. APPROX'21.
[arXiv]
Short Description: We give a hardness-of-approximation result for the Euclidean k-median problem assuming the Unique Games Conjecture (UGC). This resolves an open question posed in prior work.

 Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
FPT Approximation for Constrained Metric k-Median/Means. IPEC'20.
[arXiv][IPEC]
Short Description: We give (3+ε)- and (9+ε)-approximation algorithms in FPT time for various constrained versions of the k-median and k-means problems, respectively.

 Anup Bhattacharya, Dishant Goyal, Ragesh Jaiswal, Amit Kumar,
On Sampling-Based Algorithms for k-Means. FSTTCS'20. [FSTTCS]
Short Description: This is a combined presentation of the works below.
 Anup Bhattacharya, Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
Streaming PTAS for Binary ℓ_{0}-Low Rank Approximation.
[arXiv]
Short Description: We give a constant-pass, polylog-space streaming PTAS for the constrained binary k-means problem and the generalised binary ℓ_{0}-rank-r approximation problem.
 Dishant Goyal, Ragesh Jaiswal, and Amit Kumar,
Streaming PTAS for Constrained kMeans.
[arXiv]
Short Description: We generalise our previous results and use them to obtain constant-pass, log-space streaming PTASs for a variety of constrained k-means problems, as well as a faster PTAS for k-means under stability.

 Ragesh Jaiswal and Amit Kumar,
Multiplicative Rank-1 Approximation using Length-Squared Sampling. SOSA@SODA'20.
[arXiv][Talk]
Short Description: We show that the span of Ω(1/ε^{4}) rows of any matrix, sampled according to the length-squared distribution, contains a rank-1 matrix B that gives a (1+ε) multiplicative approximation under the Frobenius norm. Previously known results gave only additive approximations.
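Length-squared sampling itself is simple: rows are drawn i.i.d. with probability proportional to their squared Euclidean norms. A minimal sketch of the sampling step only (not the paper's full algorithm; the matrix and sample size below are illustrative):

```python
import numpy as np

def length_squared_sample(A, s, rng):
    """Sample s row indices of A i.i.d. with probability proportional
    to the squared Euclidean norm of each row (length-squared sampling)."""
    p = np.sum(A * A, axis=1)     # squared row norms
    p = p / p.sum()               # normalise to a probability distribution
    return rng.choice(A.shape[0], size=s, p=p)

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20))   # illustrative matrix
idx = length_squared_sample(A, 8, rng)
S = A[idx]   # sampled rows; the paper shows the span of enough such
             # rows contains a good rank-1 approximation of A
```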

 Ragesh Jaiswal,
A note on the relation between XOR and Selective XOR Lemmas.
[Information Processing Letters]
[arXiv]
Short Description: We give a simple reduction from the Selective XOR lemma to the standard XOR lemma.

 Nir Ailon, Anup Bhattacharya, Ragesh Jaiswal, Amit Kumar,
Approximate Clustering with SameCluster Queries.
ITCS 2018. [arXiv][Talk]
Short Description: In this work, we explore the power and limitations of same-cluster queries in the context of approximate clustering.

 Ragesh Jaiswal, Sandeep Sen,
Approximate Clustering [PDF]
Book chapter to appear in Approximation Algorithms and Metaheuristics, Volume II.
Short Description: A survey of sampling-based clustering algorithms.

 Russell Impagliazzo, Ragesh Jaiswal, Valentine Kabanets, Bruce M. Kapron, Valerie King, and Stefano Tessaro,
Simultaneous Secrecy and Reliability Amplification for a General Channel Model,
TCC 2016-B. [PDF]
Short Description: We present a general notion of a channel for cryptographic purposes, which can model a (classical) physical channel, the consequences of a cryptographic protocol, or any hybrid of the two. We consider simultaneous secrecy and reliability amplification for such channels. We show that simultaneous secrecy and reliability amplification is not possible for the most general channel model but that, at least for some parameter ranges, it is possible for a restricted class of channels that still includes both standard information-theoretic channels and keyless cryptographic protocols.

 Anup Bhattacharya, Ragesh Jaiswal, and Amit Kumar,
Faster Algorithms for the Constrained k-means Problem,
STACS 2016. Theory of Computing Systems [STACS special issue].
[arXiv] [Slides]
Short description: Classical center-based clustering problems such as k-means/median/center assume that the optimal clusters satisfy a locality property: points in the same cluster are close to each other. A number of clustering problems in machine learning have optimal clusters that do not satisfy such a locality property. For instance, consider the r-gather clustering problem, where each cluster must contain at least r points, or the capacitated clustering problem, where there is an upper bound on the cluster sizes. In this work, we consider an extreme generalisation of such problems by assuming that the target clustering is an arbitrary partition of the dataset, and we show several interesting results in this setting.

 Anup Bhattacharya, Davis Issac, Ragesh Jaiswal, and Amit Kumar,
Sampling in Space Restricted Settings,
COCOON 2015. Algorithmica [COCOON special issue].
[arXiv]
Short description: Consider the problem of sampling a uniformly random element from a data stream. This is a standard interview question, and the classic solution is to store the first item and, when item i (i > 1) arrives, replace the currently stored element with item i with probability 1/i, doing nothing with the remaining probability. It is simple to argue that after seeing n items, the probability that the stored element is item i is exactly 1/n for every i ≤ n. The space required by this algorithm is O(log n), which is optimal. However, its randomness requirement is O(n log n) bits, which is large. Randomness as a resource is relevant in computation since generating (pseudo)randomness costs energy. In the first part of this work, we show that the same goal can be achieved with O(log n) space and O(log n) random bits.
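The classic single-element scheme described above can be sketched as follows; note this textbook version draws fresh randomness for every item, which is exactly the O(n log n)-bit cost the paper improves on:

```python
import random

def reservoir_sample(stream, rng=random):
    """Keep one uniformly random element of a stream in O(log n) space:
    store the first item, then replace the stored item with item i
    (1-indexed) with probability 1/i."""
    chosen = None
    for i, item in enumerate(stream, start=1):
        if rng.randrange(i) == 0:   # true with probability exactly 1/i
            chosen = item
    return chosen
```

Over many runs, each of the n stream elements is returned with frequency close to 1/n, matching the analysis above.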
