We show that every universally truthful randomized mechanism for combinatorial auctions with submodular valuations that provides an approximation ratio of $m^{1/2-\epsilon}$ must use exponentially many value queries, where $m$ is the number of items. In contrast, ignoring incentives, there exist constant-ratio approximation algorithms for this problem. Our approach is based on a novel direct hardness technique that completely skips the notoriously hard step of characterizing truthful mechanisms. The characterization step was the main obstacle for proving impossibility results in algorithmic mechanism design so far. We demonstrate two additional applications of our new technique: (1) an impossibility result for universally truthful polynomial-time flexible combinatorial public projects and (2) an impossibility result for truthful-in-expectation mechanisms for exact combinatorial public projects. The latter is the first result that bounds the power of polynomial-time truthful-in-expectation mechanisms in any setting.
We consider the problem of designing a revenue-maximizing auction for a single item, when the values of the bidders are drawn from a correlated distribution. We observe that there exists an algorithm that finds the optimal randomized mechanism in time polynomial in the size of the support. We leverage this result to show that in the oracle model introduced by Ronen and Saberi [FOCS'02], there exists a polynomial-time truthful-in-expectation mechanism that provides a $(1.5+\epsilon)$-approximation to the revenue achievable by an optimal truthful-in-expectation mechanism, and a polynomial-time deterministic truthful mechanism that guarantees a 5/3-approximation to the revenue achievable by an optimal deterministic truthful mechanism.
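The observation that the optimal randomized mechanism can be found in time polynomial in the support size comes from linear programming. The display below is a sketch of the standard formulation under the usual assumptions (truthfulness in expectation and ex-post individual rationality over a finite support), not necessarily the exact program used in the paper; $x_i(v)$ and $p_i(v)$ denote player $i$'s allocation probability and expected payment at value profile $v$.

\[
\begin{aligned}
\max\;& \sum_{v} \Pr[v] \sum_i p_i(v)\\
\text{s.t.}\;& \sum_i x_i(v) \le 1,\quad x_i(v)\ge 0 &&\forall v\\
& v_i\,x_i(v_i,v_{-i}) - p_i(v_i,v_{-i}) \;\ge\; v_i\,x_i(v_i',v_{-i}) - p_i(v_i',v_{-i}) &&\forall i,\,v_i,\,v_i',\,v_{-i}\\
& v_i\,x_i(v) - p_i(v) \;\ge\; 0 &&\forall i,\,v
\end{aligned}
\]

Every variable and constraint is indexed by (pairs of) profiles in the support, so the program, and hence the optimal randomized mechanism, has size polynomial in the support.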
In many settings the power of truthful mechanisms is severely bounded. In this paper we use randomization to overcome this problem. In particular, we construct an FPTAS for multi-unit auctions that is truthful in expectation, whereas there is evidence that no polynomial-time truthful deterministic mechanism provides an approximation ratio better than 2. We also show for the first time that truthful-in-expectation polynomial-time mechanisms are \emph{provably} stronger than polynomial-time universally truthful mechanisms. Specifically, we show that there is a setting in which: (1) there is a non-polynomial-time truthful mechanism that always outputs the optimal solution, (2) no universally truthful randomized mechanism can provide an approximation ratio better than 2 in polynomial time, but (3) an FPTAS that is truthful in expectation exists.
We study the necessity of interaction between individuals for obtaining approximately efficient economic allocations. We view this as a formalization of Hayek's classic point of view that focuses on the information transfer advantages that markets have relative to centralized planning. We study two settings: combinatorial auctions with unit demand bidders (bipartite matching) and combinatorial auctions with subadditive bidders. In both settings we prove that non-interactive protocols require exponentially larger communication costs than interactive ones, even those that only use a modest amount of interaction.
We present an incentive-compatible polynomial-time approximation scheme for multi-unit auctions with general k-minded player valuations. The mechanism fully optimizes over an appropriately chosen sub-range of possible allocations and then uses VCG payments over this sub-range. We show that obtaining a fully polynomial-time incentive-compatible approximation scheme, at least using VCG payments, is NP-hard. For the case of valuations given by black boxes, we give a polynomial-time incentive-compatible 2-approximation mechanism and show that no better is possible, at least using VCG payments.
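The "optimize over a sub-range, then charge VCG over it" recipe is mechanical enough to sketch in code. The Python fragment below is illustrative only: the choice of the sub-range, which is where the actual design work lies, is taken as given, and valuations are represented as explicit dictionaries rather than accessed through queries.

    def vcg_over_range(valuations, allocations):
        """Maximal-in-range mechanism: exactly maximize welfare over a
        fixed sub-range `allocations`, then charge VCG (Clarke) payments
        computed over that same sub-range. `valuations[i][a]` is player
        i's value for allocation a."""
        n = len(valuations)

        def welfare(a, exclude=None):
            return sum(valuations[i][a] for i in range(n) if i != exclude)

        chosen = max(allocations, key=welfare)
        payments = []
        for i in range(n):
            # The others' best achievable welfare within the sub-range...
            alt = max(allocations, key=lambda a: welfare(a, exclude=i))
            # ...minus their welfare at the chosen allocation.
            payments.append(welfare(alt, exclude=i) - welfare(chosen, exclude=i))
        return chosen, payments

Truthfulness holds because the optimization over the fixed sub-range is exact; the approximation guarantee depends entirely on how the sub-range is chosen.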
We consider reallocation problems in settings where the initial endowment of each agent consists of a subset of the resources. The private information of the players is their value for every possible subset of the resources. The goal is to redistribute resources among agents to maximize efficiency. Monetary transfers are allowed, but participation is voluntary.
We characterize the communication complexity of truthful mechanisms. Our departure point is the well-known taxation principle. The taxation principle asserts that every truthful mechanism can be interpreted as follows: every player is presented with a menu that consists of a price for each bundle (the prices depend only on the valuations of the other players). Each player is allocated a bundle that maximizes his profit according to this menu. We define the taxation complexity of a truthful mechanism to be the logarithm of the maximum number of menus that may be presented to a player. Our main finding is that in general the taxation complexity essentially equals the communication complexity. The proof consists of two main steps. First, we prove that for rich enough domains the taxation complexity is at most the communication complexity. We then show that the taxation complexity is much smaller than the communication complexity only in "pathological" cases and provide a formal description of these extreme cases. Next, we study mechanisms that access the valuations via value queries only. In this setting we establish that the menu complexity - a notion that was already studied in several different contexts - characterizes the number of value queries that the mechanism makes in exactly the same way that the taxation complexity characterizes the communication complexity. Our approach yields several applications, including strengthening the solution concept with low communication overhead, fast computation of prices, and hardness of approximation by computationally efficient truthful mechanisms.
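As a concrete reading of the taxation principle, here is a minimal Python sketch with hypothetical dictionary-based menus: a menu assigns a price to each bundle, and the player simply takes a profit-maximizing entry.

    def choose_from_menu(valuation, menu):
        """Allocate a profit-maximizing bundle from a menu of bundle
        prices. `valuation` and `menu` map bundles (frozensets of items)
        to numbers; the menu must be computed from the other players'
        valuations only, which is what makes the mechanism truthful.
        Including the empty bundle at price 0 guarantees that the chosen
        entry yields non-negative profit."""
        return max(menu, key=lambda bundle: valuation[bundle] - menu[bundle])

The taxation complexity defined above then counts (logarithmically) how many distinct such menus a single player may face as the other players' valuations vary.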
This short note exhibits a truthful-in-expectation $O(\frac{\log m}{\log\log m})$-approximation mechanism for combinatorial auctions with subadditive bidders that uses polynomial communication.
We study combinatorial auctions where each item is sold separately but simultaneously via a second price auction. We ask whether it is possible to efficiently compute in this game a pure Nash equilibrium with social welfare close to the optimal one. We show that when the valuations of the bidders are submodular, in many interesting settings (e.g., constant number of bidders, budget additive bidders) computing an equilibrium with good welfare is essentially as easy as computing, completely ignoring incentives issues, an allocation with good welfare. On the other hand, for subadditive valuations, we show that computing an equilibrium requires exponential communication. Finally, for XOS (a.k.a. fractionally subadditive) valuations, we show that if there exists an efficient algorithm that finds an equilibrium, it must use techniques that are very different from the ones currently known.
We study combinatorial auctions with bidders that exhibit the endowment effect. In most of the previous work on cognitive biases in algorithmic game theory (e.g., [Kleinberg and Oren, EC'14] and its follow-ups) the focus was on analyzing their implications and mitigating their negative consequences. In contrast, in this paper we show how in some cases cognitive biases can be harnessed to obtain better outcomes. Specifically, we study Walrasian equilibria in combinatorial markets. It is well known that Walrasian equilibria exist only in limited settings, e.g., when all valuations are gross substitutes, but fail to exist in more general settings, e.g., when the valuations are submodular. We consider combinatorial settings in which bidders exhibit the endowment effect, that is, their value for items increases with ownership. Our main result shows that when the valuations are submodular, even a mild degree of endowment effect is sufficient to guarantee the existence of Walrasian equilibria. In fact, we show that in contrast to Walrasian equilibria with standard utility-maximizing bidders -- in which the equilibrium allocation must be efficient -- when bidders exhibit the endowment effect any local optimum can be an equilibrium allocation. Our techniques reveal interesting connections between the LP relaxation of combinatorial auctions and local maxima. We also provide lower bounds on the intensity of the endowment effect that the bidders must have in order to guarantee the existence of a Walrasian equilibrium in various settings.
Many large decentralized systems rely on information propagation to ensure their proper function. We examine a common scenario in which only participants that are aware of the information can compete for some reward, and thus informed participants have an incentive not to propagate information to others. One recent example in which such tension arises is the 2009 DARPA Network Challenge (finding red balloons). We focus on another prominent example: Bitcoin, a decentralized electronic currency system. Bitcoin represents a radical new approach to monetary systems. It has been getting a large amount of public attention over the last year, both in policy discussions and in the popular press. Its cryptographic fundamentals have largely held up even as its usage has become increasingly widespread. We find, however, that it exhibits a fundamental problem of a different nature, based on how its incentives are structured. We propose a modification to the protocol that can eliminate this problem. Bitcoin relies on a peer-to-peer network to track transactions that are performed with the currency. For this purpose, every transaction a node learns about should be transmitted to its neighbors in the network. The currently implemented protocol provides an incentive to nodes to not broadcast transactions they are aware of. Our solution is to augment the protocol with a scheme that rewards information propagation. Since clones are easy to create in the Bitcoin system, an important feature of our scheme is Sybil-proofness. We show that our proposed scheme succeeds in setting the correct incentives, that it is Sybil-proof, and that it requires only a small payment overhead; all of this is achieved with iterated elimination of dominated strategies. We complement this result by showing that there are no reward schemes in which information propagation and no self-cloning is a dominant strategy.
We study the power of polynomial-time truthful mechanisms compared to polynomial-time (non-truthful) algorithms. We show that there is a setting in which deterministic polynomial-time truthful mechanisms cannot guarantee a bounded approximation ratio, but a non-truthful FPTAS exists. We also show that in the same setting there is a universally truthful mechanism that provides an approximation ratio of 2. This shows that the cost of truthfulness is unbounded. The proofs are almost standard in the field and follow from known results.
We introduce a combinatorial variant of the cost sharing problem: several services can be provided to each player and each player values every combination of services differently. A publicly known cost function specifies the cost of providing every possible combination of services. A combinatorial cost sharing mechanism is a protocol that decides which services each player gets and at what price. We look for dominant strategy mechanisms that are (economically) efficient and cover the cost, ideally without overcharging (i.e., budget balanced). Note that unlike the standard cost sharing setting, combinatorial cost sharing is a multi-parameter domain. This makes designing dominant strategy mechanisms with good guarantees a challenging task. We present the Potential Mechanism -- a combination of the VCG mechanism and a well-known tool from the theory of cooperative games: Hart and Mas-Colell's potential function. The potential mechanism is a dominant strategy mechanism that always covers the incurred cost. When the cost function is subadditive the same mechanism is also approximately efficient. Our main technical contribution shows that when the cost function is submodular the potential mechanism is approximately budget balanced in three settings: supermodular valuations, symmetric cost function and general symmetric valuations, and two players with general valuations.
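The Hart and Mas-Colell potential that the mechanism builds on has a simple recursive form, whose marginals are the Shapley values of the cost function. The sketch below computes only the potential itself for an explicitly given cost function; the paper's pricing rule on top of it is not reproduced here, and the brute-force recursion runs in time exponential in the number of players.

    from functools import lru_cache

    def hm_potential(cost, players):
        """Hart and Mas-Colell's potential of a cost function `cost`
        defined on frozensets of players: P(empty) = 0 and
        P(S) = (cost(S) + sum over i in S of P(S without i)) / |S|.
        The marginal P(S) - P(S without i) is exactly i's Shapley value."""
        @lru_cache(maxsize=None)
        def P(S):
            if not S:
                return 0.0
            return (cost(S) + sum(P(S - frozenset({i})) for i in S)) / len(S)
        return P(frozenset(players))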
The class of gross substitutes (GS) set functions plays a central role in Economics and Computer Science. GS belongs to the hierarchy of complement-free valuations introduced by Lehmann, Lehmann and Nisan, along with other prominent classes: GS ⊊ Submodular ⊊ XOS ⊊ Subadditive. The GS class has always been more enigmatic than its counterpart classes, both in its definition and in its relation to the other classes. For example, while it is well understood how closely the Submodular, XOS and Subadditive classes (point-wise) approximate one another, approximability of these classes by GS remained wide open. In particular, the largest gap known between Submodular and GS valuations was some constant ratio smaller than 2. Our main result is the existence of a submodular valuation (one that is also budget additive) that cannot be approximated by GS within a ratio better than $\Omega(\log m/\log\log m)$, where $m$ is the number of items. En route, we uncover a new symmetrization operation that preserves GS, which may be of independent interest. We show that our main result is tight with respect to budget additive valuations. However, whether GS approximates general submodular valuations within a poly-logarithmic factor remains open, even in the special case of concave of GS valuations (a subclass of Submodular containing budget additive). For concave of Rado valuations (Rado is a significant subclass of GS, containing, e.g., weighted matroid rank functions and OXS), we show approximability by GS within an $O(\log^2 m)$ factor.
We study the communication complexity of dominant strategy implementations of combinatorial auctions. We start with two domains that are generally considered "easy": multi-unit auctions with decreasing marginal values and combinatorial auctions with gross substitutes valuations. For both domains we have fast algorithms that find the welfare-maximizing allocation with communication complexity that is poly-logarithmic in the input size. This immediately implies that welfare maximization can be achieved in ex-post equilibrium with no significant communication cost, by using VCG payments. In contrast, we show that in both domains the communication complexity of any dominant strategy implementation that achieves the optimal welfare is polynomial in the input size.
We present a new type of monotone submodular functions: \emph{multi-peak submodular functions}. Roughly speaking, given a family of sets $\mathcal{F}$, we construct a monotone submodular function $f$ with a high value $f(S)$ for every set $S \in \mathcal{F}$ (a "peak"), and a low value on every set that does not intersect significantly any set in $\mathcal{F}$. We use this construction to show that a better than $(1-\frac{1}{2e})$-approximation ($\simeq 0.816$) for welfare maximization in combinatorial auctions with submodular valuations is (1) impossible in the communication model, and (2) NP-hard in the computational model where valuations are given explicitly. Establishing a constant approximation hardness for this problem in the communication model was a long-standing open question. The valuations we construct for the hardness result in the computational model depend only on a constant number of items, and hence the result holds even if the players can answer arbitrary queries about their valuation, including demand queries. We also study two other related problems that received some attention recently: max-min allocation (for which we also get hardness of $(1-\frac{1}{2e}+\epsilon)$-approximation, in both models), and combinatorial public projects (for which we prove hardness of $(3/4+\epsilon)$-approximation in the communication model, and hardness of $(1-\frac{1}{e}+\epsilon)$-approximation in the computational model, using constant-size valuations).
The problem of maximizing a non-negative submodular function was introduced by Feige, Mirrokni, and Vondrak [FOCS'07], who provided a deterministic local-search based algorithm that guarantees an approximation ratio of $\frac{1}{3}$, as well as a randomized $\frac{2}{5}$-approximation algorithm. An extensive line of research followed, and various algorithms with improving approximation ratios were developed, all of them randomized. Finally, Buchbinder et al. [FOCS'12] presented a randomized $\frac{1}{2}$-approximation algorithm, which is the best possible. This paper gives the first deterministic algorithm for maximizing a non-negative submodular function that achieves an approximation ratio better than $\frac{1}{3}$. The approximation ratio of our algorithm is $\frac{2}{5}$. Our algorithm is based on recursive composition of solutions obtained by the local search algorithm of Feige et al. We show that the $\frac{2}{5}$ approximation ratio can be guaranteed when the recursion depth is $2$, and leave open the question of whether the approximation ratio improves as the recursion depth increases.
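Since the algorithm recursively composes solutions of the Feige et al. local search, a minimal sketch of that baseline may help. The Python version below omits the multiplicative improvement threshold that the analysis needs for a polynomial running-time guarantee; it only illustrates the add/remove neighborhood and the final comparison with the complement.

    def local_search(f, ground_set):
        """Add/remove local search for non-negative submodular
        maximization, in the spirit of Feige, Mirrokni, and Vondrak:
        start from a best singleton and move to any better neighboring
        set (one element added or removed) until none exists. At an
        exact local optimum, the better of S and its complement is a
        1/3-approximation."""
        S = {max(ground_set, key=lambda i: f({i}))}
        improved = True
        while improved:
            improved = False
            for i in ground_set:
                T = S - {i} if i in S else S | {i}
                if f(T) > f(S):
                    S, improved = T, True
                    break
        complement = set(ground_set) - S
        return S if f(S) >= f(complement) else complement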
We study the bilateral trade problem: one seller, one buyer and a single, indivisible item for sale. It is well known that there is no fully-efficient and incentive compatible mechanism for this problem that maintains a balanced budget. We design simple and robust mechanisms that obtain approximate efficiency with these properties. We show that even minimal use of statistical data can yield good approximation results. Finally, we demonstrate how a mechanism for this simple bilateral-trade problem can be used as a "black-box" for constructing mechanisms in more general environments.
We study the following communication variant of local search. There is some fixed, commonly known graph $G$. Alice holds $f_A$ and Bob holds $f_B$; both are functions that specify a value for each vertex. The goal is to find a local maximum of $f_A+f_B$ with respect to $G$, i.e., a vertex $v$ for which $(f_A+f_B)(v)\geq (f_A+f_B)(u)$ for every neighbor $u$ of $v$. Our main result is that finding a local maximum requires polynomial (in the number of vertices) bits of communication. The result holds for the following families of graphs: three-dimensional grids, hypercubes, odd graphs, and degree-4 graphs. Moreover, we provide an \emph{optimal} communication bound of $\Omega(\sqrt{N})$ for the hypercube and for constant-dimensional grids, where $N$ is the number of vertices in the graph. We provide applications of our main result in two domains, exact potential games and combinatorial auctions. First, we show that finding a pure Nash equilibrium in $2$-player $N$-action exact potential games requires polynomial (in $N$) communication. We also show that finding a pure Nash equilibrium in $n$-player $2$-action exact potential games requires exponential (in $n$) communication. The second domain that we consider is combinatorial auctions, in which we prove that finding a local maximum in combinatorial auctions requires exponential (in the number of items) communication even when the valuations are submodular. Each one of these results demonstrates an exponential separation between the non-deterministic communication complexity and the randomized communication complexity of a total search problem.
We study the necessity of interaction between individuals for obtaining approximately efficient allocations. The role of interaction in markets has received significant attention in economic thinking, e.g. in Hayek's 1945 classic paper. We consider this problem in the framework of simultaneous communication complexity. We analyze the amount of simultaneous communication required for achieving an approximately efficient allocation. In particular, we consider two settings: combinatorial auctions with unit demand bidders (bipartite matching) and combinatorial auctions with subadditive bidders. For both settings we first show that non-interactive systems have enormous communication costs relative to interactive ones. On the other hand, we show that limited interaction enables us to find approximately efficient allocations.
In settings where players have limited access to liquidity, represented in the form of budget constraints, efficiency maximization has proven to be a challenging goal. In particular, the social welfare cannot be approximated to within a factor better than the number of players. Therefore, the literature has mainly resorted to Pareto-efficiency as a way to achieve efficiency in such settings. While successful in some important scenarios, in many settings it is known either that exactly one incentive-compatible auction always outputs a Pareto-efficient solution, or that no truthful mechanism can always guarantee a Pareto-efficient outcome. Traditionally, impossibility results can be avoided by considering approximations. However, Pareto-efficiency is a binary property (it is either satisfied or not), which does not allow for approximations. In this paper we propose a new notion of efficiency, called \emph{liquid welfare}. This is the maximum amount of revenue an omniscient seller would be able to extract from a certain instance. We explain the intuition behind this objective function and show that it can be 2-approximated by two different auctions. Moreover, we show that no truthful algorithm can guarantee an approximation factor better than 4/3 with respect to the liquid welfare, and provide a truthful auction that attains this bound in a special case. Importantly, the liquid welfare benchmark also overcomes impossibilities for some settings. While it is impossible to design Pareto-efficient auctions for multi-unit auctions where players have decreasing marginal values, we give a deterministic $O(\log n)$-approximation for the liquid welfare in this setting.
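For concreteness, the liquid welfare of an allocation $(S_1,\dots,S_n)$ for players with valuations $v_i$ and budgets $B_i$ is standardly defined by capping each player's contribution at her budget:

\[
\overline{W}(S_1,\dots,S_n) \;=\; \sum_{i} \min\bigl(v_i(S_i),\, B_i\bigr).
\]

The cap is exactly what an omniscient seller could charge player $i$, which is why this benchmark coincides with the maximum extractable revenue described above.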
We study \emph{combinatorial procurement auctions}, where a buyer with a valuation function $v$ and budget $B$ wishes to buy a set of items. Each item $i$ has a cost $c_i$ and the buyer is interested in a set $S$ that maximizes $v(S)$ subject to $\Sigma_{i\in S}c_i\leq B$. Special cases of combinatorial procurement auctions are classical problems from submodular optimization. In particular, when the costs are all equal (\emph{cardinality constraint}), a classic result by Nemhauser et al. shows that the greedy algorithm provides an $\frac{e}{e-1}$ approximation. Motivated by many papers that utilize demand queries to elicit the preferences of agents in economic settings, we develop algorithms that guarantee improved approximation ratios in the presence of demand oracles. We are able to break the $\frac{e}{e-1}$ barrier: we present algorithms that use only polynomially many demand queries and have approximation ratios of $\frac{9}{8}+\epsilon$ for the general problem and $\frac{9}{8}$ for maximization subject to a cardinality constraint. We also consider the more general class of subadditive valuations. We present algorithms that obtain an approximation ratio of $2+\epsilon$ for the general problem and 2 for maximization subject to a cardinality constraint. We guarantee these approximation ratios even when the valuations are non-monotone. We show that these ratios are essentially optimal, in the sense that for any constant $\epsilon>0$, obtaining an approximation ratio of $2-\epsilon$ requires exponentially many demand queries.
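Several of the results above assume demand oracles. Recall the standard definition: a demand query specifies item prices $p_1,\dots,p_m$ and returns a set maximizing the buyer's utility at those prices,

\[
S^{*} \in \arg\max_{S \subseteq [m]} \Bigl( v(S) - \sum_{i\in S} p_i \Bigr).
\]

Demand queries can be exponentially more powerful than value queries, which is what the improved ratios above exploit.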
We analyze the revenue loss due to market shrinkage. Specifically, consider a simple market with one item for sale and $n$ bidders whose values are drawn from some joint distribution. Suppose that the market shrinks as a single bidder retires from the market. Suppose furthermore that the value of this retiring bidder is fixed and always strictly smaller than the values of the other bidders. We show that even this slight decrease in competition might cause a significant fall, of a multiplicative factor of $\frac{1}{e+1} \approx 0.268$, in the revenue that can be obtained by a dominant strategy ex-post individually rational mechanism. In particular, our results imply a solution to an open question that was posed by Dobzinski, Fu, and Kleinberg [STOC'11].
Let $(f,P)$ be an incentive compatible mechanism where $f$ is the social choice function and $P$ is the payment function. In many important settings, $f$ uniquely determines $P$ (up to a constant) and therefore a common approach is to focus on the design of $f$ and neglect the role of the payment function. Fadel and Segal [JET, 2009] question this approach through the lens of communication complexity: can it be that the communication complexity of an incentive compatible mechanism that implements $f$ (that is, computes both the output and the payments) is much larger than the communication complexity of computing the output alone? That is, can it be that $cc_{IC}(f)\gg cc(f)$? Fadel and Segal show that for every $f$, $cc_{IC}(f)\leq \exp(cc(f))$. They also show that fully computing the incentive compatible mechanism is strictly harder than computing only the output: there exists a social choice function $f$ such that $cc_{IC}(f)=cc(f)+1$. In a follow-up work, Babaioff, Blumrosen, Naor, and Schapira [EC'08] provide a social choice function $f$ such that $cc_{IC}(f)=\Theta(n\cdot cc(f))$, where $n$ is the number of players. The question of whether the exponential upper bound of Fadel and Segal is tight remained wide open. In this paper we solve this question by explicitly providing an $f$ such that $cc_{IC}(f)=\exp(cc(f))$. In fact, we establish this via two very different proofs. In contrast, we show that if the players are risk-neutral and we compromise on a randomized truthful-in-expectation implementation (rather than a deterministic ex-post implementation), then $cc_{TIE}(f)=poly(n,cc(f))$ for every function $f$, as long as the domain of $f$ is single parameter or a convex multi-parameter domain. We also provide efficient algorithms for deterministic computation of payments in several important domains.
We present fast algorithms for sketching valuation functions. Let $N$ ($|N|=n$) be some ground set and $v:2^N\rightarrow \mathbb R$ be a function. We say that $\tilde v:2^N\rightarrow \mathbb R$ is an $\alpha$-sketch of $v$ if for every set $S$ we have that $\frac {v(S)} {\alpha} \leq \tilde v(S) \leq v(S)$ and $\tilde v$ can be described in $poly(n)$ bits. Goemans et al. [SODA'09] showed that if $v$ is submodular then there exists an $\tilde O(\sqrt n)$-sketch that can be constructed using polynomially many value queries (this is the best possible, as Balcan and Harvey [STOC'11] show that no submodular function admits an $n^{\frac 1 3 - \epsilon}$-sketch). Based on their work, Balcan et al. [COLT'12] and Badanidiyuru et al. [SODA'12] show that if $v$ is subadditive then there exists an $\tilde O(\sqrt n)$-sketch that can be constructed using polynomially many demand queries. All previous sketches are based on complicated geometric constructions. The first step in their constructions is proving the existence of a good sketch by finding an ellipsoid that ``approximates'' $v$ well (this is done by applying John's theorem to ensure the existence of an ellipsoid that is ``close'' to the polymatroid that is associated with $v$). The second step is showing this ellipsoid can be found efficiently, and this is done by repeatedly solving a certain convex program to obtain better approximations of John's ellipsoid. In this paper, we give a much simpler, non-geometric proof for the existence of good sketches, and utilize the proof to obtain much faster algorithms that match the previously obtained approximation bounds. Specifically, we provide an algorithm that finds an $\tilde O(\sqrt n)$-sketch of a submodular function with only $\tilde O(n^\frac{3}{2})$ value queries, and an algorithm that finds an $\tilde O(\sqrt n)$-sketch of a subadditive function with $O(n)$ demand and value queries.
One of the fundamental questions of Algorithmic Mechanism Design is whether there exists an inherent clash between truthfulness and computational tractability: in particular, whether polynomial-time truthful mechanisms for combinatorial auctions are provably weaker in terms of approximation ratio than non-truthful ones. This question was very recently answered for universally truthful mechanisms for combinatorial auctions \cite{D11}, and even for truthful-in-expectation mechanisms \cite{DughmiV11}. However, both of these results are based on information-theoretic arguments for valuations given by a value oracle, and leave open the possibility of polynomial-time truthful mechanisms for succinctly described classes of valuations. This paper is the first to prove {\em computational hardness} results for truthful mechanisms for combinatorial auctions with succinctly described valuations. We prove that there is a class of succinctly represented submodular valuations for which no deterministic truthful mechanism provides an $m^{1/2-\epsilon}$-approximation for a constant $\epsilon>0$, unless $NP=RP$ ($m$ denotes the number of items). Furthermore, we prove that even truthful-in-expectation mechanisms cannot approximate combinatorial auctions with certain succinctly described submodular valuations better than within $n^\gamma$, where $n$ is the number of bidders and $\gamma>0$ some absolute constant, unless $NP \subseteq P/poly$. In addition, we prove computational hardness results for two related problems.
The problem of scheduling unrelated machines by a truthful mechanism to minimize the makespan was introduced in the seminal "Algorithmic Mechanism Design" paper by Nisan and Ronen. Nisan and Ronen showed that there is a truthful mechanism that provides an approximation ratio of $\min(m,n)$, where $n$ is the number of machines and $m$ is the number of jobs. They also proved that no truthful mechanism can provide an approximation ratio better than $2$. Since then, the lower bound was improved to $1+\sqrt 2 \approx 2.41$ by Christodoulou, Kotsoupias, and Vidali, and then to $1+\phi\approx 2.618$ by Kotsoupias and Vidali. Very recently, the lower bound was improved to $2.755$ by Giannakopoulos, Hammerl, and Pocas. In this paper we further improve the bound to $3-\delta$, for every constant $\delta>0$. Note that a gap between the upper bound and the lower bounds exists even when the number of machines and jobs is very small. In particular, the known $1+\sqrt{2}$ lower bound requires at least $3$ machines and $5$ jobs. In contrast, we show a lower bound of $2.2055$ that uses only $3$ machines and $3$ jobs and a lower bound of $1+\sqrt 2$ that uses only $3$ machines and $4$ jobs. For the case of two machines and two jobs we show a lower bound of $2$. Similar bounds for two machines and two jobs were known before but only via complex proofs that characterized all truthful mechanisms that provide a finite approximation ratio in this setting, whereas our new proof uses a simple and direct approach.
We study a central problem in Algorithmic Mechanism Design: constructing truthful mechanisms for welfare maximization in combinatorial auctions with submodular bidders. Dobzinski, Nisan, and Schapira provided the first mechanism that guarantees a non-trivial approximation ratio of $O(\log^2 m)$ [STOC'06], where $m$ is the number of items. This was subsequently improved to $O(\log m\log \log m)$ [Dobzinski, APPROX'07] and then to $O(\log m)$ [Krysta and Vocking, ICALP'12]. In this paper we develop the first mechanism that breaks the logarithmic barrier. Specifically, the mechanism provides an approximation ratio of $O(\sqrt {\log m})$. Similarly to previous constructions, our mechanism uses polynomially many value and demand queries, and in fact provides the same approximation ratio for the larger class of XOS (a.k.a. fractionally subadditive) valuations. We also develop a computationally efficient implementation of the mechanism for combinatorial auctions with budget additive bidders. Although in general computing a demand query is NP-hard for budget additive valuations, we observe that the specific form of demand queries that our mechanism uses can be efficiently computed when bidders are budget additive.
We exhibit incentive compatible multi-unit auctions that are not affine maximizers (i.e., are not of the VCG family) and yet approximate the social welfare to within a factor of $1+\epsilon$. For the case of two-item two-bidder auctions we show that these auctions, termed Triage auctions, are the only scalable ones that give an approximation factor better than 2. "Scalable" means that the allocation does not depend on the units in which the valuations are measured. We deduce from this that any scalable computationally-efficient incentive-compatible auction for $m$ items and $n \ge 2$ bidders cannot approximate the social welfare to within a factor better than 2. This is in contrast to arbitrarily good approximations that can be reached under computational constraints alone, and in contrast to the fact that the optimal social welfare can be obtained under incentive constraints alone.
We present a constant-factor approximation algorithm for the Nash Social Welfare (NSW) maximization problem with subadditive valuations accessible via demand queries. More generally, we propose a framework for NSW optimization that assumes two subroutines: one that (1) solves a configuration-type LP under certain additional conditions, and one that (2) rounds the fractional solution with respect to utilitarian social welfare. In particular, a constant-factor approximation for submodular valuations with value queries can also be derived from our framework.
We study the bilateral trade problem where a seller owns a single indivisible item, and a potential buyer seeks to purchase it. Previous mechanisms for this problem only considered the case where the values of the buyer and the seller are drawn from independent distributions. In contrast, this paper studies bilateral trade mechanisms when the values are drawn from a joint distribution. We prove that the buyer-offering mechanism guarantees an approximation ratio of $\frac{e}{e-1} \approx 1.582$ to the social welfare even if the values are drawn from a joint distribution. The buyer-offering mechanism is Bayesian incentive compatible, but the seller has a dominant strategy. We prove the buyer-offering mechanism is optimal in the sense that no Bayesian mechanism where one of the players has a dominant strategy can obtain an approximation ratio better than $\frac{e}{e-1}$. We also show that no mechanism in which both sides have a dominant strategy can provide any constant approximation to the social welfare when the values are drawn from a joint distribution. Finally, we prove some impossibility results on the power of general Bayesian incentive compatible mechanisms. In particular, we show that no deterministic Bayesian incentive-compatible mechanism can provide an approximation ratio better than $1+\frac{\ln 2}{2}\approx 1.346$.
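To fix ideas, here is a minimal Python sketch of one run of the buyer-offering mechanism, assuming (for illustration) that the buyer's price is supplied externally; computing the Bayesian-optimal offer from the joint distribution is the part not modeled here.

    def buyer_offering_round(buyer_value, offer_price, seller_value):
        """One run of the buyer-offering mechanism: the buyer commits to
        a take-it-or-leave-it price, and the seller's dominant strategy
        is to accept exactly when the offer covers her value. Returns
        the realized social welfare: the value of whoever ends up
        holding the item."""
        trade = offer_price >= seller_value
        return buyer_value if trade else seller_value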
We study incentive-compatible mechanisms that maximize the Nash Social Welfare. Since traditional incentive-compatible mechanisms cannot maximize the Nash Social Welfare even approximately, we propose changing the traditional model. Inspired by a widely used charging method (e.g., royalties, or a lawyer that charges some percentage of a possible future compensation), we suggest charging the players some percentage of their value for the outcome. We call this model the \emph{percentage fee} model. We show that there is a mechanism that exactly maximizes the Nash Social Welfare in every setting with non-negative valuations. Moreover, we prove an analog of Roberts' theorem that essentially says that if the valuations are non-negative, then the only implementable social choice functions are those that maximize weighted variants of the Nash Social Welfare. We develop polynomial-time incentive-compatible approximation algorithms for the Nash Social Welfare with subadditive valuations and prove some hardness results.
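To make the two objects in this abstract concrete, here is a minimal sketch (all function names and the dictionary-based interface are ours, purely illustrative): the Nash Social Welfare of an allocation, and a player's net utility under a percentage fee, where payment is a fraction of the player's own value rather than an arbitrary price.

```python
from math import prod

def nash_social_welfare(players, alloc, v):
    """Geometric mean of the players' values for their own bundles."""
    values = [v[i](alloc[i]) for i in players]
    return prod(values) ** (1.0 / len(values))

def percentage_fee_utility(i, alloc, v, fee):
    """Under a percentage fee, player i pays fee * v_i(bundle), so her net
    utility is a fixed fraction of her value for the outcome."""
    return (1.0 - fee) * v[i](alloc[i])
```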
We explore the performance of polynomial-time incentive-compatible mechanisms in single-crossing domains. Single-crossing domains were extensively studied in the economics literature. Roughly speaking, a domain is single crossing if monotonicity characterizes incentive compatibility (intuitively, an algorithm is monotone if a bidder that ``improves'' his valuation is allocated a better outcome). That is, single-crossing domains are the standard mathematical formulation of domains that are informally known as ``single parameter''. In all major single-crossing domains studied so far (e.g., welfare maximization in various auctions with single-minded bidders, makespan minimization on related machines), the performance of the best polynomial-time incentive-compatible mechanisms matches the performance of the best polynomial-time non-incentive-compatible algorithms. Our two main results make progress in understanding the power of incentive-compatible polynomial-time mechanisms in single-crossing domains. We provide the first proof of a gap in the power of polynomial-time incentive-compatible mechanisms and polynomial-time non-incentive-compatible algorithms: we present an objective function in a single-crossing multi-unit auction for which there is a polynomial-time algorithm that provides an approximation ratio of $\frac{1}{2}$, yet no polynomial-time incentive-compatible mechanism provides a finite approximation (under standard computational complexity assumptions). The objective function used above is not natural. We show that to some extent this is unavoidable by providing a sweeping positive result for the most natural objective function in multi-unit auctions, that of welfare maximization: we present an incentive-compatible FPTAS mechanism for every multi-unit auction with single-crossing domains. This improves over the mechanism of Briest et al. [STOC'05] that only applies to the much simpler case of single-minded bidders.
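The standard recipe this abstract builds on, in single-parameter/single-crossing settings, is that any monotone allocation rule combined with critical-value (threshold) payments is incentive compatible. A hedged sketch, where `allocate` and `bid_space` are hypothetical stand-ins for a concrete monotone algorithm and a discretized value space:

```python
def critical_value(bid_space, allocate, bids, i):
    """The smallest report at which player i still wins; winners pay it."""
    for b in sorted(bid_space):
        if allocate({**bids, i: b})[i]:
            return b
    return None

def threshold_mechanism(bid_space, allocate, bids):
    winners = allocate(bids)  # allocate must be monotone for this to be IC
    payments = {i: critical_value(bid_space, allocate, bids, i) if winners[i] else 0
                for i in bids}
    return winners, payments
```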
We study the bilateral trade problem where a seller owns a single indivisible item, and a potential buyer seeks to purchase it. Previous mechanisms for this problem only considered the case where the values of the buyer and the seller are drawn from independent distributions. In this paper, we study bilateral trade mechanisms when the values are drawn from a joint distribution. We prove that the buyer-offering mechanism guarantees an approximation ratio of $\frac e {e-1} \approx 1.582$ to the social welfare even if the values are drawn from a joint distribution. The buyer-offering mechanism is Bayesian incentive compatible, but the seller has a dominant strategy. We prove the buyer-offering mechanism is optimal in the sense that no Bayesian mechanism where one of the players has a dominant strategy can obtain an approximation ratio better than $\frac e {e-1}$. We also show that no mechanism in which both sides have a dominant strategy can provide any constant approximation to the social welfare when the values are drawn from a joint distribution. Finally, we prove some impossibility results on the power of general Bayesian incentive compatible mechanisms. In particular, we show that no deterministic Bayesian incentive-compatible mechanism can provide an approximation ratio better than $1+\frac {\ln 2} 2\approx 1.346$.
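A minimal sketch of the buyer-offering mechanism on a discrete joint distribution (the dictionary interface is ours): `joint` maps (seller_value, buyer_value) pairs to probabilities. Given his value `v_b`, the buyer posts the take-it-or-leave-it price that maximizes his conditional expected utility; accepting any price at or above her value is a dominant strategy for the seller.

```python
def buyer_offer(joint, v_b):
    # Conditional (unnormalized) distribution of the seller's value given v_b.
    cond = {s: p for (s, b), p in joint.items() if b == v_b}

    def expected_utility(price):
        accept_mass = sum(p for s, p in cond.items() if s <= price)
        return accept_mass * (v_b - price)

    # An optimal offer lies on the support of the seller's value.
    return max(sorted(cond), key=expected_utility)
```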
We present a constant-factor approximation algorithm for the Nash social welfare maximization problem with subadditive valuations accessible via demand queries. More generally, we propose a template for NSW optimization by solving a configuration-type LP and using a rounding procedure for (utilitarian) social welfare as a blackbox, which could be applicable to other variants of the problem.
In this paper we revisit the notion of simplicity in mechanisms. We consider a seller of $m$ items, facing a single buyer with valuation $v$. We observe that previous attempts to define complexity measures often fail to classify mechanisms that are intuitively considered simple (e.g., the "selling separately" mechanism) as such. We suggest viewing a menu as simple if a bundle that maximizes the buyer's profit can be found by conducting a few primitive operations that are considered simple. The \emph{primitive complexity of a menu} is the number of primitive operations needed to (adaptively) find a profit-maximizing entry in the menu. In this paper, the primitive operation that we study is essentially computing the outcome of the "selling separately" mechanism. Does the primitive complexity capture the simplicity of other auctions that are intuitively simple? We consider \emph{bundle-size pricing}, a common pricing method in which the price of a bundle depends only on its size. Our main technical contribution is determining the primitive complexity of bundle-size pricing menus in various settings. We show that for any distribution $\cal D$ over weighted matroid rank valuations, even distributions with arbitrary correlation among their values, there is always a bundle-size pricing menu with low primitive complexity that achieves almost the same revenue as the optimal bundle-size pricing menu. As part of this proof we provide a randomized algorithm that, for any weighted matroid rank valuation $v$ and integer $k$, finds the most valuable set of size $k$ with only a poly-logarithmic number of demand and value queries. We show that this result is essentially tight in several aspects. For example, if the valuation $v$ is submodular, then finding the most valuable set of size $k$ requires exponentially many queries (this solves an open question of Badanidiyuru et al. [EC'12]).
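As a concrete illustration of the primitive operation, here is a hedged sketch for the easy additive special case (the names and the additive restriction are ours, not the paper's): computing the outcome of "selling separately" amounts to taking every item priced below its value. For general valuations, this single operation is exactly what the primitive complexity of a menu counts.

```python
def selling_separately_additive(item_values, item_prices):
    """Outcome of selling separately for an additive buyer:
    buy each item whose value exceeds its price."""
    bundle = {i for i, val in item_values.items() if val > item_prices[i]}
    profit = sum(item_values[i] - item_prices[i] for i in bundle)
    return bundle, profit
```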
We study the communication complexity of dominant strategy implementations of combinatorial auctions. We start with two domains that are generally considered "easy": multi-unit auctions with decreasing marginal values and combinatorial … We study the communication complexity of dominant strategy implementations of combinatorial auctions. We start with two domains that are generally considered "easy": multi-unit auctions with decreasing marginal values and combinatorial auctions with gross substitutes valuations. For both domains we have fast algorithms that find the welfare-maximizing allocation with communication complexity that is poly-logarithmic in the input size. This immediately implies that welfare maximization can be achieved in ex-post equilibrium with no significant communication cost, by using VCG payments. In contrast, we show that in both domains the communication complexity of any dominant strategy implementation that achieves the optimal welfare is polynomial in the input size. We then move on to studying the approximation ratios achievable by dominant strategy mechanisms. For multi-unit auctions with decreasing marginal values, we provide a dominant-strategy communication FPTAS. For combinatorial auctions with general valuations, we show that there is no dominant strategy mechanism that achieves an approximation ratio better than $m^{1-\epsilon}$ that uses $poly(m,n)$ bits of communication, where $m$ is the number of items and $n$ is the number of bidders. In contrast, a \emph{randomized} dominant strategy mechanism that achieves an $O(\sqrt m)$ approximation with $poly(m,n)$ communication is known. This proves the first gap between computationally efficient deterministic dominant strategy mechanisms and randomized ones. En route, we answer an open question on the communication cost of implementing dominant strategy mechanisms for more than two players, and also solve some open problems in the area of simultaneous combinatorial auctions.
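A hedged sketch of the positive side for the first "easy" domain (the interface is ours): with decreasing marginal values, welfare maximization is a simple greedy over marginal values, so VCG payments make truthful reporting an ex-post equilibrium. Here `marginals[i]` is bidder i's non-increasing list of marginal values for successive units and `k` is the supply.

```python
import heapq

def greedy_welfare(marginals, k):
    heap = [(-m[0], i, 0) for i, m in marginals.items() if m]
    heapq.heapify(heap)
    units = {i: 0 for i in marginals}
    welfare = 0.0
    for _ in range(k):
        if not heap:
            break
        neg_value, i, j = heapq.heappop(heap)
        welfare -= neg_value          # add the next-largest marginal value
        units[i] += 1
        if j + 1 < len(marginals[i]):
            heapq.heappush(heap, (-marginals[i][j + 1], i, j + 1))
    return units, welfare

def vcg_payment(marginals, k, i):
    # Clarke pivot: what the others lose because bidder i participates.
    others = {j: m for j, m in marginals.items() if j != i}
    _, welfare_without_i = greedy_welfare(others, k)
    units, _ = greedy_welfare(marginals, k)
    welfare_of_others = sum(sum(marginals[j][:units[j]]) for j in others)
    return welfare_without_i - welfare_of_others
```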
We introduce the notion of rigidity in auction design and use it to analyze some fundamental aspects of mechanism design. We focus on single-item auctions where the values of the bidders are drawn from some (possibly correlated) distribution $\mathcal F$. Let $f$ be the allocation function of an optimal mechanism for $\mathcal F$. Informally, $S$ is (linearly) rigid in $\mathcal F$ if for every mechanism $M'$ whose allocation function $f'$ agrees with $f$ on the allocation of at most an $x$-fraction of the instances of $S$, the expected revenue of $M'$ is at most an $x$-fraction of the optimal revenue. We use rigidity to explain the singular success of Cremer and McLean's auction. Recall that the revenue of Cremer and McLean's auction is the optimal welfare if the distribution obeys a certain ``full rank'' condition, but no analogous constructions are known if this condition does not hold. Note that the Kolmogorov complexity of the allocation function of Cremer and McLean's auction is logarithmic, whereas we use rigidity to show that for some distributions that do not obey the full rank condition, the Kolmogorov complexity of the allocation function of every mechanism that provides a constant approximation is almost linear. We further investigate rigidity assuming different notions of individual rationality. Assuming ex-post individual rationality, if there is a rigid set, the structure of the optimal mechanism is simple: the player with the highest value ``usually'' wins the item and contributes most of the revenue. In contrast, assuming interim individual rationality, there are distributions with a rigid set $S$ where the optimal mechanism has no obvious allocation pattern (i.e., its Kolmogorov complexity is high). Our results help explain why we have little hope of developing good, simple and generic approximation mechanisms in the interim individual rationality world.
A rapidly growing literature on lying in behavioral economics and psychology shows that individuals often do not lie even when lying maximizes their utility. In this work, we attempt to incorporate these findings into the theory of mechanism design. We consider players that have a preference for truth-telling and will only lie if their benefit from lying is sufficiently larger than the loss of the others. To accommodate such players, we introduce $\alpha$-moral mechanisms, in which the gain of a player from misreporting his true value, compared to truth-telling, is at most $\alpha$ times the loss that the others incur due to the misreporting. We develop a theory of moral mechanisms in the canonical setting of single-item auctions. We identify similarities and disparities to the standard theory of truthful mechanisms. In particular, we show that the allocation function does not uniquely determine the payments and is unlikely to admit a simple characterization. In contrast, recall that monotonicity characterizes the allocation function of truthful mechanisms. Our main technical effort is invested in determining whether the auctioneer can exploit the preference for truth-telling of the players to extract more revenue compared to truthful mechanisms. We show that the auctioneer can extract more revenue when the values of the players are correlated, even when there are only two players. However, we show that truthful mechanisms are revenue-maximizing even among moral ones when the values of the players are independently drawn from certain identical distributions. As a by-product we get an alternative proof of Myerson's characterization in the settings that we consider. We flesh out this approach by providing an alternative proof of Myerson's characterization that does not involve moral mechanisms whenever the values are independently drawn from regular distributions.
We study the classic bilateral trade setting. Myerson and Satterthwaite show that there is no Bayesian incentive compatible and budget-balanced mechanism that obtains the gains from trade of the first-best mechanism. Consider the random-offerer mechanism: with probability $\frac{1}{2}$ run the \emph{seller-offering} mechanism, in which the seller offers the buyer a take-it-or-leave-it price that maximizes the expected profit of the seller, and with probability $\frac{1}{2}$ run the \emph{buyer-offering} mechanism. Very recently, Deng, Mao, Sivan, and Wang showed that the gains from trade of the random-offerer mechanism are at least a $\frac 1 {8.23}\approx 0.121$ fraction of the gains from trade of the first-best mechanism. Perhaps a natural conjecture is that the gains-from-trade of the random-offerer mechanism, which are known to be at least half of the gains-from-trade of the second-best mechanism, are also at least half of the gains-from-trade of the first-best mechanism. However, in this note we exhibit distributions such that the gains-from-trade of the random-offerer mechanism are smaller than a $0.495$-fraction of the gains-from-trade of the first-best mechanism.
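A hedged numerical sketch of the quantities in this note, for independent discrete distributions given as dicts from value to probability (the function names are ours). It computes the first-best gains from trade (GFT) and the GFT of the random-offerer mechanism, so that candidate distributions can be tested directly against the conjectured $\frac12$ ratio.

```python
def first_best_gft(seller, buyer):
    return sum(ps * pb * (b - s)
               for s, ps in seller.items()
               for b, pb in buyer.items() if b > s)

def seller_offering_gft(seller, buyer):
    gft = 0.0
    for s, ps in seller.items():
        # The seller posts a profit-maximizing price at or above her value.
        candidates = [p for p in buyer if p >= s] or [float('inf')]
        best = max(candidates,
                   key=lambda p: (p - s) * sum(pb for b, pb in buyer.items() if b >= p))
        gft += ps * sum(pb * (b - s) for b, pb in buyer.items() if b >= best)
    return gft

def buyer_offering_gft(seller, buyer):
    gft = 0.0
    for b, pb in buyer.items():
        candidates = [p for p in seller if p <= b] or [float('-inf')]
        best = max(candidates,
                   key=lambda p: (b - p) * sum(ps for s, ps in seller.items() if s <= p))
        gft += pb * sum(ps * (b - s) for s, ps in seller.items() if s <= best)
    return gft

def random_offerer_gft(seller, buyer):
    return 0.5 * seller_offering_gft(seller, buyer) + 0.5 * buyer_offering_gft(seller, buyer)
```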
The class of gross substitutes (GS) set functions plays a central role in Economics and Computer Science. GS belongs to the hierarchy of complement free valuations introduced by Lehmann, Lehmann and Nisan, along with other prominent classes: $GS \subsetneq Submodular \subsetneq XOS \subsetneq Subadditive$. The GS class has always been more enigmatic than its counterpart classes, both in its definition and in its relation to the other classes. For example, while it is well understood how closely the Submodular, XOS and Subadditive classes (point-wise) approximate one another, approximability of these classes by GS remained wide open. In particular, the largest gap known between Submodular and GS valuations was some constant ratio smaller than 2. Our main result is the existence of a submodular valuation (one that is also budget additive) that cannot be approximated by GS within a ratio better than $\Omega(\frac{\log m}{\log\log m})$, where $m$ is the number of items. En route, we uncover a new symmetrization operation that preserves GS, which may be of independent interest. We show that our main result is tight with respect to budget additive valuations. However, whether GS approximates general submodular valuations within a poly-logarithmic factor remains open, even in the special case of concave of GS valuations (a subclass of Submodular containing budget additive). For concave of Rado valuations (Rado is a significant subclass of GS, containing, e.g., weighted matroid rank functions and OXS), we show approximability by GS within an $O(\log^2 m)$ factor.
Consider a seller that intends to auction some item. The seller can invest money and effort in advertising in different market segments in order to recruit $n$ bidders to the auction. Alternatively, the seller can have a much cheaper and focused marketing operation and recruit the same number of bidders from a single market segment. Which marketing operation should the seller choose? More formally, let $D=\{\mathcal D_1,\ldots, \mathcal D_n\}$ be a set of distributions. Our main result shows that there is always $\mathcal D_i\in D$ such that the revenue that can be extracted from $n$ bidders, where the value of each is independently drawn from $\mathcal D_i$, is at least $\frac 1 2 \cdot (1-\frac 1 e)$ of the revenue that can be obtained by any possible mix of bidders, where the value of each bidder is drawn from some (possibly different) distribution that belongs to $D$. We next consider situations in which the auctioneer cannot use the optimal auction and is required to use a second price auction. We show that there is always $\mathcal D_i\in D$ such that if the value of every bidder is independently drawn from $\mathcal D_i$, then running a second price auction guarantees a constant fraction of the revenue that can be obtained by a second-price auction with any possible mix of bidders. Finally, we show that for any $\varepsilon>0$ there exists a function $f$ that depends only on $\varepsilon$ (in particular, the function does not depend on $n$ or on the set $D$), such that recruiting $n$ bidders with at most $f(\varepsilon)$ different distributions, all from $D$, guarantees a $(1-\varepsilon)$-fraction of the revenue that can be obtained by a second-price auction with any possible mix of bidders.
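A hedged Monte Carlo sketch of the second-price comparisons above: estimate the expected revenue (the second-highest value) when all $n$ bidders come from a single segment versus a given mix. The two segment samplers below are purely illustrative assumptions, not distributions from the paper.

```python
import random

def second_price_revenue(samplers, trials=100_000):
    total = 0.0
    for _ in range(trials):
        values = sorted(sampler() for sampler in samplers)
        total += values[-2]  # the winner pays the second-highest value
    return total / trials

# Illustration with two hypothetical segments and n = 4 bidders:
segments = [lambda: random.expovariate(1.0), lambda: random.uniform(0.0, 2.0)]
best_single = max(second_price_revenue([d] * 4) for d in segments)
mixed = second_price_revenue(segments * 2)
```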
Let $(f,P)$ be an incentive compatible mechanism where $f$ is the social choice function and $P$ is the payment function. In many important settings, $f$ uniquely determines $P$ (up to a constant) and therefore a common approach is to focus on the design of $f$ and neglect the role of the payment function. Fadel and Segal [JET, 2009] question this approach through the lens of communication complexity: can it be that the communication complexity of an incentive compatible mechanism that implements $f$ (that is, computes both the output and the payments) is much larger than the communication complexity of computing the output? I.e., can it be that $cc_{IC}(f)\gg cc(f)$? Fadel and Segal show that for every $f$, $cc_{IC}(f)\leq \exp(cc(f))$. They also show that fully computing the incentive compatible mechanism can be strictly harder than computing only the output: there exists a social choice function $f$ such that $cc_{IC}(f)=cc(f)+1$. In a follow-up work, Babaioff, Blumrosen, Naor, and Schapira [EC'08] provide a social choice function $f$ such that $cc_{IC}(f)=\Theta(n\cdot cc(f))$, where $n$ is the number of players. The question of whether the exponential upper bound of Fadel and Segal is tight remained wide open. In this paper we solve this question by explicitly providing an $f$ such that $cc_{IC}(f)=\exp(cc(f))$. In fact, we establish this via two very different proofs. In contrast, we show that if the players are risk-neutral, then compromising on a randomized truthful-in-expectation implementation (rather than a deterministic ex-post implementation) gives $cc_{TIE}(f)=poly(n,cc(f))$ for every function $f$, as long as the domain of $f$ is single parameter or a convex multi-parameter domain. We also provide efficient algorithms for deterministic computation of payments in several important domains.
The problem of scheduling unrelated machines by a truthful mechanism to minimize the makespan was introduced in the seminal "Algorithmic Mechanism Design" paper by Nisan and Ronen. Nisan and Ronen showed that there is a truthful mechanism that provides an approximation ratio of $\min(m,n)$, where $n$ is the number of machines and $m$ is the number of jobs. They also proved that no truthful mechanism can provide an approximation ratio better than $2$. Since then, the lower bound was improved to $1+\sqrt 2 \approx 2.41$ by Christodoulou, Koutsoupias, and Vidali, and then to $1+\phi\approx 2.618$ by Koutsoupias and Vidali. Very recently, the lower bound was improved to $2.755$ by Giannakopoulos, Hammerl, and Pocas. In this paper we further improve the bound to $3-\delta$, for every constant $\delta>0$. Note that a gap between the upper bound and the lower bounds exists even when the number of machines and jobs is very small. In particular, the known $1+\sqrt{2}$ lower bound requires at least $3$ machines and $5$ jobs. In contrast, we show a lower bound of $2.2055$ that uses only $3$ machines and $3$ jobs and a lower bound of $1+\sqrt 2$ that uses only $3$ machines and $4$ jobs. For the case of two machines and two jobs we show a lower bound of $2$. Similar bounds for two machines and two jobs were known before, but only via complex proofs that characterized all truthful mechanisms that provide a finite approximation ratio in this setting, whereas our new proof uses a simple and direct approach.
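For context, a minimal sketch of the truthful mechanism behind the Nisan-Ronen upper bound (the dictionary interface is ours): run an independent second-price (VCG) auction for each job, so each job goes to the machine reporting the smallest time, at a payment equal to the second-smallest report for that job.

```python
def per_job_vcg(times):
    """times[i][j] = reported time of machine i on job j (at least 2 machines)."""
    machines = list(times)
    num_jobs = len(times[machines[0]])
    alloc = {i: [] for i in machines}
    pay = {i: 0.0 for i in machines}
    for j in range(num_jobs):
        ranked = sorted(machines, key=lambda i: times[i][j])
        winner, runner_up = ranked[0], ranked[1]
        alloc[winner].append(j)
        pay[winner] += times[runner_up][j]  # the critical value for job j
    return alloc, pay
```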
We introduce a combinatorial variant of the cost sharing problem: several services can be provided to each player and each player values every combination of services differently. A publicly known cost function specifies the cost of providing every possible combination of services. A combinatorial cost sharing mechanism is a protocol that decides which services each player gets and at what price. We look for dominant strategy mechanisms that are (economically) efficient and cover the cost, ideally without overcharging (i.e., budget balanced). Note that unlike the standard cost sharing setting, combinatorial cost sharing is a multi-parameter domain. This makes designing dominant strategy mechanisms with good guarantees a challenging task. We present the Potential Mechanism -- a combination of the VCG mechanism and a well-known tool from the theory of cooperative games: Hart and Mas-Colell's potential function. The potential mechanism is a dominant strategy mechanism that always covers the incurred cost. When the cost function is subadditive the same mechanism is also approximately efficient. Our main technical contribution shows that when the cost function is submodular the potential mechanism is approximately budget balanced in three settings: supermodular valuations, symmetric cost function and general symmetric valuations, and two players with general valuations.
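A hedged sketch of Hart and Mas-Colell's potential function used by the Potential Mechanism (the Python interface is ours): $P(\emptyset)=0$ and $P(S) = \big(c(S) + \sum_{i \in S} P(S \setminus \{i\})\big)/|S|$, so that $P(S) - P(S \setminus \{i\})$ is exactly player $i$'s Shapley cost share.

```python
from functools import lru_cache

def make_potential(cost):
    """cost: a function from frozensets of players to the cost of serving them."""
    @lru_cache(maxsize=None)
    def P(S):
        if not S:
            return 0.0
        return (cost(S) + sum(P(S - {i}) for i in S)) / len(S)
    return P

# Usage: P = make_potential(c); the Shapley cost share of player i in a
# coalition S (a frozenset) is P(S) - P(S - {i}).
```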
We study combinatorial auctions with bidders that exhibit the endowment effect. In most of the previous work on cognitive biases in algorithmic game theory (e.g., [Kleinberg and Oren, EC'14] and its follow-ups) the focus was on analyzing the implications of such biases and mitigating their negative consequences. In contrast, in this paper we show how in some cases cognitive biases can be harnessed to obtain better outcomes. Specifically, we study Walrasian equilibria in combinatorial markets. It is well known that Walrasian equilibria exist only in limited settings, e.g., when all valuations are gross substitutes, but fail to exist in more general settings, e.g., when the valuations are submodular. We consider combinatorial settings in which bidders exhibit the endowment effect, that is, their value for items increases with ownership. Our main result shows that when the valuations are submodular, even a mild degree of endowment effect is sufficient to guarantee the existence of Walrasian equilibria. In fact, we show that in contrast to Walrasian equilibria with standard utility-maximizing bidders -- in which the equilibrium allocation must be efficient -- when bidders exhibit the endowment effect any local optimum can be an equilibrium allocation. Our techniques reveal interesting connections between the LP relaxation of combinatorial auctions and local maxima. We also provide lower bounds on the intensity of the endowment effect that the bidders must have in order to guarantee the existence of a Walrasian equilibrium in various settings.
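For reference, a hedged brute-force sketch of the equilibrium notion discussed above (names are ours): item prices and an allocation form a Walrasian equilibrium if every bidder receives a profit-maximizing bundle at those prices and every item is allocated.

```python
from itertools import chain, combinations

def is_walrasian_equilibrium(items, v, alloc, p):
    def price(S):
        return sum(p[i] for i in S)

    def max_profit(bidder):
        subsets = chain.from_iterable(combinations(items, r)
                                      for r in range(len(items) + 1))
        return max(v[bidder](frozenset(S)) - price(S) for S in subsets)

    market_clears = set().union(*alloc.values()) == set(items)
    all_happy = all(v[b](frozenset(alloc[b])) - price(alloc[b]) >= max_profit(b) - 1e-9
                    for b in alloc)
    return market_clears and all_happy
```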
We study the following communication variant of local search. There is some fixed, commonly known graph $G$. Alice holds $f_A$ and Bob holds $f_B$, both of which are functions that specify a value for each vertex. The goal is to find a local maximum of $f_A+f_B$ with respect to $G$, i.e., a vertex $v$ for which $(f_A+f_B)(v)\geq (f_A+f_B)(u)$ for every neighbor $u$ of $v$. Our main result is that finding a local maximum requires polynomially many (in the number of vertices) bits of communication. The result holds for the following families of graphs: three-dimensional grids, hypercubes, odd graphs, and degree-4 graphs. Moreover, we provide an \emph{optimal} communication bound of $\Omega(\sqrt{N})$ for the hypercube and for constant-dimensional grids, where $N$ is the number of vertices in the graph. We provide applications of our main result in two domains, exact potential games and combinatorial auctions. First, we show that finding a pure Nash equilibrium in $2$-player $N$-action exact potential games requires polynomial (in $N$) communication. We also show that finding a pure Nash equilibrium in $n$-player $2$-action exact potential games requires exponential (in $n$) communication. The second domain that we consider is combinatorial auctions, in which we prove that finding a local maximum in combinatorial auctions requires exponential (in the number of items) communication even when the valuations are submodular. Each one of these results demonstrates an exponential separation between the non-deterministic communication complexity and the randomized communication complexity of a total search problem.
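A hedged sketch of the search problem itself: walking uphill on $f_A+f_B$. Each step of such a walk costs communication (the players must exchange values on the current neighborhood); the lower bounds above show that, on the listed graph families, no protocol does substantially better than a walk of this kind.

```python
def steepest_ascent(graph, f_a, f_b, start):
    """graph: dict mapping each vertex to a list of its neighbors.
    Returns a local maximum of f_a + f_b with respect to graph."""
    v = start
    while True:
        best = max(graph[v] + [v], key=lambda u: f_a[u] + f_b[u])
        if f_a[best] + f_b[best] <= f_a[v] + f_b[v]:
            return v  # no neighbor strictly improves: v is a local maximum
        v = best
```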
We analyze the revenue loss due to market shrinkage. Specifically, consider a simple market with one item for sale and $n$ bidders whose values are drawn from some joint distribution. Suppose that the market shrinks as a single bidder retires from the market. Suppose furthermore that the value of this retiring bidder is fixed and always strictly smaller than the values of the other players. We show that even this slight decrease in competition might cause a significant fall of a multiplicative factor of $\frac{1}{e+1}\approx0.268$ in the revenue that can be obtained by a dominant strategy ex-post individually rational mechanism. In particular, our results imply a solution to an open question that was posed by Dobzinski, Fu, and Kleinberg [STOC'11].
We present fast algorithms for sketching valuation functions. Let $N$ ($|N|=n$) be some ground set and $v: 2^N \rightarrow \mathbb{R}$ be a function. We say that $\tilde{v}: 2^N \rightarrow \mathbb{R}$ is an $\alpha$-sketch of $v$ if for every set $S$ we have that $v(S)/\alpha \leq \tilde{v}(S) \leq v(S)$ and $\tilde{v}$ can be described in $poly(n)$ bits. Goemans et al. [SODA'09] showed that if $v$ is submodular then there exists an $\tilde{O}(\sqrt{n})$-sketch that can be constructed using polynomially many value queries (this is essentially the best possible, as Balcan and Harvey [STOC'11] show that no submodular function admits an $n^{1/3-\epsilon}$-sketch). Based on their work, Balcan et al. [COLT'12] and Badanidiyuru et al. [SODA'12] show that if $v$ is subadditive, then there exists an $\tilde{O}(\sqrt{n})$-sketch that can be constructed using polynomially many demand queries. All previous sketches are based on complicated geometric constructions. The first step in their constructions is proving the existence of a good sketch by finding an ellipsoid that "approximates" $v$ well (this is done by applying John's theorem to ensure the existence of an ellipsoid that is "close" to the polymatroid that is associated with $v$). The second step is to show that this ellipsoid can be found efficiently, and this is done by repeatedly solving a certain convex program to obtain better approximations of John's ellipsoid. In this article, we give a significantly simpler, nongeometric proof for the existence of good sketches and utilize the proof to obtain much faster algorithms that match the previously obtained approximation bounds. Specifically, we provide an algorithm that finds an $\tilde{O}(\sqrt{n})$-sketch of a submodular function with only $\tilde{O}(n^{3/2})$ value queries, and we provide an algorithm that finds an $\tilde{O}(\sqrt{n})$-sketch of a subadditive function with $O(n)$ demand and value queries.
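The $\alpha$-sketch guarantee above, written as a checkable predicate (a sketch with illustrative names; `sets` is whatever family of test sets one cares to verify on):

```python
def is_alpha_sketch(v, v_tilde, sets, alpha):
    """Check v(S)/alpha <= v_tilde(S) <= v(S) on the given sets."""
    return all(v(S) / alpha <= v_tilde(S) <= v(S) for S in sets)
```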
We characterize the communication complexity of truthful mechanisms. Our departure point is the well known taxation principle. The taxation principle asserts that every truthful mechanism can be interpreted as follows: every player is presented with a menu that consists of a price for each bundle (the prices depend only on the valuations of the other players). Each player is allocated a bundle that maximizes his profit according to this menu. We define the taxation complexity of a truthful mechanism to be the logarithm of the maximum number of menus that may be presented to a player. Our main finding is that in general the taxation complexity essentially equals the communication complexity. The proof consists of two main steps. First, we prove that for rich enough domains the taxation complexity is at most the communication complexity. We then show that the taxation complexity is much smaller than the communication complexity only in "pathological" cases and provide a formal description of these extreme cases. Next, we study mechanisms that access the valuations via value queries only. In this setting we establish that the menu complexity -- a notion that was already studied in several different contexts -- characterizes the number of value queries that the mechanism makes in exactly the same way that the taxation complexity characterizes the communication complexity. Our approach yields several applications, including strengthening the solution concept with low communication overhead, fast computation of prices, and hardness of approximation by computationally efficient truthful mechanisms.
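The taxation principle itself is easy to operationalize; a minimal sketch (hypothetical names) of a player facing a menu of bundle prices and receiving a profit-maximizing bundle:

    def best_bundle(valuation, menu):
        """Given a menu mapping bundles (frozensets) to prices, return the
        bundle maximizing profit v(S) - p(S). The empty bundle at price 0
        is always available, so the chosen profit is never negative."""
        options = dict(menu)
        options.setdefault(frozenset(), 0.0)
        return max(options, key=lambda S: valuation(S) - options[S])

    # In a truthful mechanism, these prices would depend only on the
    # reports of the *other* players.
    menu = {frozenset({'a'}): 3.0, frozenset({'b'}): 2.0,
            frozenset({'a', 'b'}): 4.5}
    v = lambda S: {0: 0.0, 1: 3.5, 2: 5.0}[len(S)]  # a symmetric valuation
    print(best_bundle(v, menu))  # frozenset({'b'}), with profit 1.5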
We study a central problem in Algorithmic Mechanism Design: constructing truthful mechanisms for welfare maximization in combinatorial auctions with submodular bidders. Dobzinski, Nisan, and Schapira provided the first mechanism that guarantees a non-trivial approximation ratio of $O(\log^2 m)$ [STOC'06], where $m$ is the number of items. This was subsequently improved to $O(\log m \log\log m)$ [Dobzinski, APPROX'07] and then to $O(\log m)$ [Krysta and Vocking, ICALP'12]. In this paper we develop the first mechanism that breaks the logarithmic barrier. Specifically, the mechanism provides an approximation ratio of $O(\sqrt{\log m})$. Similarly to previous constructions, our mechanism uses polynomially many value and demand queries, and in fact provides the same approximation ratio for the larger class of XOS (a.k.a. fractionally subadditive) valuations. We also develop a computationally efficient implementation of the mechanism for combinatorial auctions with budget additive bidders. Although in general computing a demand query is NP-hard for budget additive valuations, we observe that the specific form of demand queries that our mechanism uses can be efficiently computed when bidders are budget additive.
We study the bilateral trade problem: one seller, one buyer and a single, indivisible item for sale. It is well known that there is no fully-efficient and incentive compatible mechanism for this problem that maintains a balanced budget. We design simple and robust mechanisms that obtain approximate efficiency with these properties. We show that even minimal use of statistical data can yield good approximation results. Finally, we demonstrate how a mechanism for this simple bilateral-trade problem can be used as a black-box for constructing mechanisms in more general environments.
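A canonical example of the kind of simple, robust mechanism studied here (our sketch, not necessarily the paper's exact construction) is a posted-price rule: trade happens iff the buyer's value is at least a fixed price p and the seller's cost is at most p. It is truthful, individually rational, and budget balanced by design, and choosing p from minimal statistical data (e.g., the median of the seller's cost distribution) is one way such a rule can guarantee approximate efficiency.

    def posted_price_trade(buyer_value, seller_cost, p):
        """Trade at fixed price p iff buyer_value >= p >= seller_cost.
        Truthful: reports cannot move p, only determine whether trade
        occurs, so neither side gains by misreporting. Budget balanced:
        the buyer's payment goes entirely to the seller."""
        if buyer_value >= p >= seller_cost:
            return {"trade": True, "buyer_pays": p, "seller_gets": p}
        return {"trade": False, "buyer_pays": 0.0, "seller_gets": 0.0}

    print(posted_price_trade(buyer_value=0.9, seller_cost=0.4, p=0.5))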
We consider reallocation problems in settings where the initial endowment of each agent consists of a subset of the resources. The private information of the players is their value for every possible subset of the resources. The goal is to redistribute resources among agents to maximize efficiency. Monetary transfers are allowed, but participation is voluntary.
The design of computationally efficient and incentive compatible mechanisms that solve or approximate fundamental resource allocation problems is the main goal of algorithmic mechanism design. A central example in both theory and practice is welfare-maximization in combinatorial auctions. Recently, a randomized mechanism has been discovered for combinatorial auctions that is truthful in expectation and guarantees a (1-1/e)-approximation to the optimal social welfare when players have coverage valuations [DRY11]. This approximation ratio is the best possible even for non-truthful algorithms, assuming P ≠ NP. Given the recent sequence of negative results for combinatorial auctions under more restrictive notions of incentive compatibility, this development raises a natural question: Are truthful-in-expectation mechanisms compatible with polynomial-time approximation in a way that deterministic or universally truthful mechanisms are not? In particular, can polynomial-time truthful-in-expectation mechanisms guarantee a near-optimal approximation ratio for more general variants of combinatorial auctions? We prove that this is not the case. Specifically, the result of [DRY11] cannot be extended to combinatorial auctions with submodular valuations in the value oracle model. (Absent strategic considerations, a (1-1/e)-approximation is still achievable in this setting.) More precisely, we prove that there is a constant $\gamma>0$ such that there is no randomized mechanism that is truthful-in-expectation -- or even approximately truthful-in-expectation -- and guarantees an $m^{-\gamma}$-approximation to the optimal social welfare for combinatorial auctions with submodular valuations in the value oracle model. We also prove an analogous result for the flexible combinatorial public projects (CPP) problem, where a truthful-in-expectation $(1-1/e)$-approximation for coverage valuations has been recently developed [Dughmi11]. We show that there is no truthful-in-expectation -- or even approximately truthful-in-expectation -- mechanism that achieves an $m^{-\gamma}$-approximation to the optimal social welfare for combinatorial public projects with submodular valuations in the value oracle model. Both our results present an unexpected separation between coverage functions and submodular functions, which does not occur for these problems without strategic considerations.
Algorithmic mechanism design (AMD) studies the delicate interplay between computational efficiency, truthfulness, and optimality. We focus on AMD's paradigmatic problem: combinatorial auctions. We present a new generalization of the VC dimension to multivalued collections of functions, which encompasses the classical VC dimension, Natarajan dimension, and Steele dimension. We present a corresponding generalization of the Sauer-Shelah Lemma and harness this VC machinery to establish inapproximability results for deterministic truthful mechanisms. Our results essentially unify all inapproximability results for deterministic truthful mechanisms for combinatorial auctions to date and establish new separation gaps between truthful and non-truthful algorithms.
Submodular function maximization is a central problem in combinatorial optimization, generalizing many important problems including Max Cut in directed/undirected graphs and in hypergraphs, certain constraint satisfaction problems, maximum entropy sampling, and maximum facility location problems. Unlike submodular minimization, submodular maximization is NP-hard. In this paper, we give the first constant-factor approximation algorithm for maximizing any non-negative submodular function subject to multiple matroid or knapsack constraints. We emphasize that our results are for non-monotone submodular functions. In particular, for any constant $k$, we present a $\frac{1}{k+2+1/k+\varepsilon}$-approximation for the submodular maximization problem under $k$ matroid constraints, and a $(\frac{1}{5}-\varepsilon)$-approximation algorithm for this problem subject to $k$ knapsack constraints ($\varepsilon>0$ is any constant). We improve the approximation guarantee of our algorithm to $\frac{1}{k+1+\frac{1}{k-1}+\varepsilon}$ for $k\ge 2$ partition matroid constraints. This idea also gives a $\frac{1}{k+\varepsilon}$-approximation for maximizing a monotone submodular function subject to $k\ge 2$ partition matroids, which improves over the previously best known guarantee of $\frac{1}{k+1}$.
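As a point of reference for the unconstrained non-monotone problem, a classical observation of Feige, Mirrokni, and Vondrák is that a uniformly random subset is already a 1/4-approximation in expectation; a minimal sketch (our names) on a toy cut function:

    import random

    def random_subset_baseline(ground, f, trials=1000, seed=0):
        """For non-negative submodular f, a single uniformly random subset
        is a 1/4-approximation in expectation; taking the best of several
        trials can only help."""
        rng = random.Random(seed)
        best_set, best_val = frozenset(), f(frozenset())
        for _ in range(trials):
            S = frozenset(i for i in ground if rng.random() < 0.5)
            if f(S) > best_val:
                best_set, best_val = S, f(S)
        return best_set, best_val

    # Cut functions are non-negative, submodular, and non-monotone.
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    cut = lambda S: sum((u in S) != (v in S) for u, v in edges)
    print(random_subset_baseline(range(4), cut))  # finds the max cut, value 4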
Recent work has considered theoretical models for the behavior of agents with specific behavioral biases: rather than making decisions that optimize a given payoff function, the agent behaves inefficiently because its decisions suffer from an underlying bias. These approaches have generally considered an agent who experiences a single behavioral bias, studying the effect of this bias on the outcome. In general, however, decision-making can and will be affected by multiple biases operating at the same time. How do multiple biases interact to produce the overall outcome? Here we consider decisions in the presence of a pair of biases exhibiting an intuitively natural interaction: present bias -- the tendency to value costs incurred in the present too highly -- and sunk-cost bias -- the tendency to incorporate costs experienced in the past into one's plans for the future.
We prove the first separation in the approximation guarantee achievable by truthful and non-truthful combinatorial auctions with polynomial communication. Specifically, we prove that any truthful auction guaranteeing a (3/4 − 1/240 + ε)-approximation for two buyers with XOS valuations over m items requires exp(Ω(ε^2 · m)) communication, whereas a non-truthful auction by Feige [J. Comput. 2009] is already known to achieve a 3/4-approximation in poly(m) communication.
A longstanding open problem in Algorithmic Mechanism Design is to design computationally-efficient truthful mechanisms for (approximately) maximizing welfare in combinatorial auctions with submodular bidders. The first such mechanism was obtained by Dobzinski, Nisan, and Schapira [STOC'06] who gave an $O(\log^2 m)$-approximation where $m$ is the number of items. This problem has been studied extensively since, culminating in an $O(\sqrt{\log m})$-approximation mechanism by Dobzinski [STOC'16]. We present a computationally-efficient truthful mechanism with an approximation ratio that improves upon the state-of-the-art by an exponential factor. In particular, our mechanism achieves an $O((\log\log m)^3)$-approximation in expectation, uses only $O(n)$ demand queries, and is universally truthful. This settles in the negative an open question of Dobzinski on whether $\Theta(\sqrt{\log m})$ is the best approximation ratio achievable in this setting.
We study a novel class of mechanism design problems in which the outcomes are constrained by the payments. This basic class of mechanism design problems captures many common economic situations, and yet it has not been studied, to our knowledge, in the past. We focus on the case of procurement auctions in which sellers have private costs, and the auctioneer aims to maximize a utility function on subsets of items, under the constraint that the sum of the payments provided by the mechanism does not exceed a given budget. Standard mechanism design ideas such as the VCG mechanism and its variants are not applicable here. We show that, for general functions, the budget constraint can render mechanisms arbitrarily bad in terms of the utility of the buyer. However, our main result shows that for the important class of submodular functions, a bounded approximation ratio is achievable. Better approximation results are obtained for subclasses of the submodular functions. We explore the space of budget feasible mechanisms in other domains and give a characterization under more restricted conditions.
We study combinatorial procurement auctions, where a buyer with a valuation function v and budget B wishes to buy a set of items. Each item i has a cost $c_i$ and the buyer is interested in a set S that maximizes v(S) subject to $\sum_{i\in S} c_i \le B$. Special cases of combinatorial procurement auctions are well-studied problems from submodular optimization. In particular, when the costs are all equal (cardinality constraint), a classic result by Nemhauser et al. shows that the greedy algorithm provides an $\frac{e}{e-1}$-approximation.
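For reference, the greedy algorithm mentioned for the cardinality-constrained case is a few lines; a minimal sketch (our names) with a toy coverage valuation:

    def greedy_cardinality(ground, v, k):
        """Repeatedly add the item with the largest marginal value until k
        items are chosen. For monotone submodular v this achieves at least
        a (1 - 1/e) fraction of the optimum, i.e., an e/(e-1) approximation
        (Nemhauser et al.)."""
        S = frozenset()
        for _ in range(k):
            gains = {i: v(S | {i}) - v(S) for i in ground if i not in S}
            if not gains or max(gains.values()) <= 0:
                break
            S = S | {max(gains, key=gains.get)}
        return S

    # Toy coverage valuation: each item covers a set of elements.
    covers = {0: {"a", "b"}, 1: {"b", "c"}, 2: {"c"}, 3: {"d"}}
    v = lambda S: len(set().union(*(covers[i] for i in S)))
    print(greedy_cardinality(covers, v, k=2))  # picks item 0, then item 1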
Simultaneous item auctions are simple and practical procedures for allocating items to bidders with potentially complex preferences. In a simultaneous auction, every bidder submits independent bids on all items simultaneously. The allocation and prices are then resolved for each item separately, based solely on the bids submitted on that item. We study the efficiency of Bayes-Nash equilibrium (BNE) outcomes of simultaneous first- and second-price auctions when bidders have complement-free (a.k.a. subadditive) valuations. While it is known that the social welfare of every pure Nash equilibrium (NE) constitutes a constant fraction of the optimal social welfare, a pure NE rarely exists, and moreover, the full information assumption is often unrealistic. Therefore, quantifying the welfare loss in Bayes-Nash equilibria is of particular interest. Previous work established a logarithmic bound on the ratio between the social welfare of a BNE and the expected optimal social welfare in both first-price auctions (Hassidim et al., 2011) and second-price auctions (Bhawalkar and Roughgarden, 2011), leaving a large gap between a constant and a logarithmic ratio. We introduce a new proof technique and use it to resolve both of these gaps in a unified way. Specifically, we show that the expected social welfare of any BNE is at least 1/2 of the optimal social welfare in the case of first-price auctions, and at least 1/4 in the case of second-price auctions.
We consider the problem of maximizing a non-negative submodular set function $f : 2^N \to \mathbb{R}_+$ over a ground set N subject to a variety of packing type constraints including (multiple) matroid constraints, knapsack constraints, and their intersections. In this paper we develop a general framework that allows us to derive a number of new results, in particular when f may be a non-monotone function. Our algorithms are based on (approximately) solving the multilinear extension F of f [5] over a polytope P that represents the constraints, and then effectively rounding the fractional solution. Although this approach has been used quite successfully in some settings [6, 22, 24, 13, 3], it has been limited in some important ways. We overcome these limitations as follows.
We consider prior-free auctions for revenue and welfare maximization when agents have a common budget. The abstract environments we consider are ones where there is a downward-closed and symmetric feasibility constraint on the probabilities of service of the agents. These environments include position auctions where slots with decreasing click-through rates are auctioned to advertisers. We generalize and characterize the envy-free benchmark from Hartline and Yan [2011] to settings with budgets and characterize the optimal envy-free outcomes for both welfare and revenue. We give prior-free mechanisms that approximate these benchmarks. A building block in our mechanism is a clinching auction for position auction environments. This auction is a generalization of the multi-unit clinching auction of Dobzinski et al. [2008] and a special case of the polyhedral clinching auction of Goel et al. [2012]. For welfare maximization, we show that this clinching auction is a good approximation to the envy-free optimal welfare for position auction environments. For profit maximization, we generalize the random sampling profit extraction auction from Fiat et al. [2002] for digital goods to give a 10.0-approximation to the envy-free optimal revenue in symmetric, downward-closed environments. Even without budgets this revenue maximization question is of interest and we obtain an improved approximation bound of 7.5 (from 30.4 by Ha and Hartline [2012]).
A sequence of recent studies show that even in the simple setting of a single seller and a single buyer with additive, independent valuations over m items, the revenue-maximizing mechanism is prohibitively complex. This problem has been addressed using two main approaches: Approximation: the best of two simple mechanisms (sell each item separately, or sell all the items as one bundle) gives 1/6 of the optimal revenue [1]. Enhanced competition: running the simple VCG mechanism with additional m buyers extracts at least the optimal revenue in the original market [17]. Both approaches, however, suffer from severe drawbacks: On the one hand, losing 83% of the revenue is hardly acceptable in any application. On the other hand, attracting a linear number of new buyers may be prohibitive. We show that by combining the two approaches one can achieve the best of both worlds. Specifically, for any constant ε one can obtain a (1-ε) fraction of the optimal revenue by running simple mechanisms --- either selling each item separately or selling all items as a single bundle --- with substantially fewer additional buyers: logarithmic, constant, or even none in some cases.
We consider the following communication problem: Alice and Bob each have valuation functions $v_1(\cdot)$ and $v_2(\cdot)$ over subsets of m items, and their goal is to partition the items into $S, \bar{S}$ in a way that maximizes the welfare $v_1(S) + v_2(\bar{S})$. We study both the allocation problem, which asks for a welfare-maximizing partition, and the decision problem, which asks whether or not there exists a partition guaranteeing certain welfare, for binary XOS valuations. For interactive protocols with poly(m) communication, a tight 3/4-approximation is known for both [29, 23]. For interactive protocols, the allocation problem is provably harder than the decision problem: any solution to the allocation problem implies a solution to the decision problem with one additional round and log m additional bits of communication via a trivial reduction. Surprisingly, the allocation problem is provably easier for simultaneous protocols. Specifically, we show: (1) There exists a simultaneous, randomized protocol with polynomial communication that selects a partition whose expected welfare is at least 3/4 of the optimum. This matches the guarantee of the best interactive, randomized protocol with polynomial communication. (2) For all ε > 0, any simultaneous, randomized protocol that decides whether the welfare of the optimal partition is ≥ 1 or ≤ 3/4 − 1/108 + ε correctly with probability > 1/2 + 1/poly(m) requires exponential communication. This provides a separation between the attainable approximation guarantees via interactive (3/4) versus simultaneous (≤ 3/4 − 1/108) protocols with polynomial communication. In other words, the trivial reduction from decision to allocation problems provably requires the extra round of communication. We further discuss the implications of our results for the design of truthful combinatorial auctions in general, and extensions to general XOS valuations. In particular, our protocol for the allocation problem implies a new style of truthful mechanisms.
We study the problem of allocating a set of indivisible goods among a set of agents in a fair and efficient manner. An allocation is said to be fair if it is envy-free up to one good (EF1), which means that each agent prefers its own bundle over the bundle of any other agent up to the removal of one good. In addition, an allocation is deemed efficient if it satisfies Pareto efficiency. While each of these well-studied properties is easy to achieve separately, achieving them together is far from obvious. Recently, Caragiannis et al. (2016) established the surprising result that when agents have additive valuations for the goods, there always exists an allocation that simultaneously satisfies these two seemingly incompatible properties. Specifically, they showed that an allocation that maximizes the Nash social welfare objective is both EF1 and Pareto efficient. However, the problem of maximizing Nash social welfare is NP-hard. As a result, this approach does not provide an efficient algorithm for finding a fair and efficient allocation. In this paper, we bypass this barrier, and develop a pseudopolynomial time algorithm for finding allocations that are EF1 and Pareto efficient; in particular, when the valuations are bounded, our algorithm finds such an allocation in polynomial time. Furthermore, we establish a stronger existence result compared to Caragiannis et al. (2016): For additive valuations, there always exists an allocation that is EF1 and fractionally Pareto efficient. Another key contribution of our work is to show that our algorithm provides a polynomial-time 1.45-approximation to the Nash social welfare objective. This improves upon the best known approximation ratio for this problem (namely, the 2-approximation algorithm of Cole et al., 2017), and also matches the lower bound on the integrality gap of the convex program of Cole et al. (2017). Unlike many of the existing approaches, our algorithm is completely combinatorial, and relies on constructing integral Fisher markets wherein specific equilibria are not only efficient, but also fair.
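A minimal check of the EF1 condition for additive valuations (our illustration), directly from the definition:

    def is_ef1(values, allocation):
        """values[i][g]: agent i's additive value for good g;
        allocation[i]: the set of goods given to agent i.
        EF1: for every pair i, j, agent i's value for its own bundle is at
        least its value for j's bundle minus j's single best good (in i's
        eyes); the condition is vacuous when j's bundle is empty."""
        agents = range(len(allocation))
        for i in agents:
            mine = sum(values[i][g] for g in allocation[i])
            for j in agents:
                if i == j or not allocation[j]:
                    continue
                theirs = sum(values[i][g] for g in allocation[j])
                best_single = max(values[i][g] for g in allocation[j])
                if mine < theirs - best_single:
                    return False
        return True

    values = [{"x": 5, "y": 1, "z": 4}, {"x": 2, "y": 3, "z": 3}]
    print(is_ef1(values, [{"x"}, {"y", "z"}]))  # True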
Present bias, the tendency to weigh costs and benefits incurred in the present too heavily, is one of the most widespread human behavioral biases. It has also been the subject of extensive study in the behavioral economics literature. While the simplest models assume that decision-making agents are naive, reasoning about the future without taking their bias into account, there is considerable evidence that people often behave in ways that are sophisticated with respect to present bias, making plans based on the belief that they will be present-biased in the future. For example, committing to a course of action to reduce future opportunities for procrastination or overconsumption are instances of sophisticated behavior in everyday life.
We study combinatorial auctions with bidders that exhibit endowment effect. In most of the previous work on cognitive biases in algorithmic game theory (e.g., [Kleinberg and Oren, EC'14] and its follow-ups) the focus was on analyzing the implications and mitigating their negative consequences. In contrast, in this paper we show how in some cases cognitive biases can be harnessed to obtain better outcomes. Specifically, we study Walrasian equilibria in combinatorial markets. It is well known that Walrasian equilibria exist only in limited settings, e.g., when all valuations are gross substitutes, but fail to exist in more general settings, e.g., when the valuations are submodular. We consider combinatorial settings in which bidders exhibit the endowment effect, that is, their value for items increases with ownership. Our main result shows that when the valuations are submodular, even a mild degree of endowment effect is sufficient to guarantee the existence of Walrasian equilibria. In fact, we show that in contrast to Walrasian equilibria with standard utility maximizing bidders -- in which the equilibrium allocation must be efficient -- when bidders exhibit endowment effect any local optimum can be an equilibrium allocation. Our techniques reveal interesting connections between the LP relaxation of combinatorial auctions and local maxima. We also provide lower bounds on the intensity of the endowment effect that the bidders must have in order to guarantee the existence of a Walrasian equilibrium in various settings.
A major achievement of mechanism design theory is a general method for the construction of truthful mechanisms called VCG (Vickrey, Clarke, Groves). When applying this method to complex problems such as combinatorial auctions, a difficulty arises: VCG mechanisms are required to compute optimal outcomes and are, therefore, computationally infeasible. However, if the optimal outcome is replaced by the results of a sub-optimal algorithm, the resulting mechanism (termed VCG-based) is no longer necessarily truthful. The first part of this paper studies this phenomenon in depth and shows that it is near universal. Specifically, we prove that essentially all reasonable approximations or heuristics for combinatorial auctions as well as a wide class of cost minimization problems yield non-truthful VCG-based mechanisms. We generalize these results for affine maximizers. The second part of this paper proposes a general method for circumventing the above problem. We introduce a modification of VCG-based mechanisms in which the agents are given a chance to improve the output of the underlying algorithm. When the agents behave truthfully, the welfare obtained by the mechanism is at least as good as the one obtained by the algorithm's output. We provide a strong rationale for truth-telling behavior. Our method satisfies individual rationality as well.
We study the design of Bayesian incentive compatible mechanisms in single parameter domains, for the objective of optimizing social efficiency as measured by social cost. In the problems we consider, a group of participants compete to receive service from a mechanism that can provide such services at a cost. The mechanism wishes to choose which agents to serve in order to maximize social efficiency, but is not willing to suffer an expected loss: the agents' payments should cover the cost of service in expectation.
In this paper we present improved bounds for approximating maximum matchings in bipartite graphs in the streaming model. First, we consider the question of how well maximum matching can be approximated in a single pass over the input when Õ(n) space is allowed, where n is the number of vertices in the input graph. Two natural variants of this problem have been considered in the literature: (1) the edge arrival setting, where edges arrive in the stream and (2) the vertex arrival setting, where vertices on one side of the graph arrive in the stream together with all their incident edges. The latter setting has also been studied extensively in the context of online algorithms, where each arriving vertex has to either be matched irrevocably or discarded upon arrival. In the online setting, the celebrated algorithm of Karp-Vazirani-Vazirani achieves a 1 − 1/e approximation by crucially using randomization (and using Õ(n) space). Despite the fact that the streaming model is less restrictive in that the algorithm is not constrained to match vertices irrevocably upon arrival, the best known approximation in the streaming model with vertex arrivals and Õ(n) space is the same factor of 1 − 1/e.
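For context, the folklore one-pass baseline in the edge-arrival setting is greedy maximal matching, which keeps only the current matching in memory (Õ(n) space) and is a 1/2-approximation; a minimal sketch:

    def streaming_greedy_matching(edge_stream):
        """One pass: keep an arriving edge iff both endpoints are still
        free. The result is a maximal matching, hence at least half the
        size of a maximum matching."""
        matched, matching = set(), []
        for u, v in edge_stream:
            if u not in matched and v not in matched:
                matching.append((u, v))
                matched.update((u, v))
        return matching

    print(streaming_greedy_matching([(1, 2), (2, 3), (3, 4), (5, 6)]))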
Budget feasible mechanism design studies procurement combinatorial auctions in which the sellers have private costs to produce items, and the buyer (auctioneer) aims to maximize a social valuation function on subsets of items, under the budget constraint on the total payment. One of the most important questions in the field is which valuation domains admit truthful budget feasible mechanisms with "small" approximations (compared to the social optimum)? Singer [35] showed that additive and submodular functions have a constant approximation mechanism. Recently, Dobzinski, Papadimitriou, and Singer [20] gave an O(log^2 n) approximation mechanism for subadditive functions; further, they remarked that: "A fundamental question is whether, regardless of computational constraints, a constant-factor budget feasible mechanism exists for subadditive functions." In this paper, we address this question from two viewpoints: prior-free worst case analysis and Bayesian analysis, which are two standard approaches from computer science and economics, respectively. For the prior-free framework, we use a linear program (LP) that describes the fractional cover of the valuation function; the LP is also connected to the concept of approximate core in cooperative game theory. We provide a mechanism for subadditive functions whose approximation is O(I), via the worst case integrality gap I of this LP. This implies an O(log n)-approximation for subadditive valuations, O(1)-approximation for XOS valuations, as well as for valuations having a constant integrality gap. XOS valuations are an important class of functions and lie between the submodular and the subadditive classes of valuations. We further give another polynomial time O(log n / log log n) sub-logarithmic approximation mechanism for subadditive functions. Both of our mechanisms improve the best known approximation ratio O(log^2 n). For the Bayesian framework, we provide a constant approximation mechanism for all subadditive functions, using the above prior-free mechanism for XOS valuations as a subroutine. Our mechanism allows correlations in the distribution of private information and is universally truthful.
Time-inconsistency refers to a paradox in decision making where agents exhibit inconsistent behaviors over time. Examples are procrastination, where agents tend to postpone easy tasks, and abandonment, where agents start a plan and quit in the middle. To capture such behaviors and to quantify the inefficiency they cause, Kleinberg and Oren (2014) propose a graph model with a certain cost structure and initiate the study of several interesting computational problems: (1) cost ratio: the worst ratio between the actual cost of the agent and the optimal cost, over all graph instances; (2) motivating subgraph: how to motivate the agent to reach the goal by deleting nodes and edges; (3) intermediate rewards: how to incentivize agents to reach the goal by placing intermediate rewards. Kleinberg and Oren give partial answers to these questions, but the main problems are open. In this paper, we give answers to all three open problems. First, we show a tight upper bound on the cost ratio for graphs, and confirm the conjecture by Kleinberg and Oren that Akerlof's structure is indeed the worst case for cost ratio. Second, we prove that finding a motivating subgraph is NP-hard, showing that it is generally inefficient to motivate agents by deleting nodes and edges in the graph. Last but not least, we show that computing a strategy to place a minimum amount of total reward is also NP-hard and we provide a 2n-approximation algorithm.
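The naive present-biased agent of the Kleinberg-Oren model is straightforward to simulate; a minimal sketch (our names) where, at each node, the agent takes the outgoing edge minimizing b·(cost now) + (cheapest remaining cost at face value) -- equivalent to discounting all future costs by 1/b -- and then re-plans at the next node:

    import heapq

    def dist_to_target(graph, t):
        """Face-value shortest distance from every node to t, via Dijkstra
        on the reversed graph. graph[u] = {v: cost of edge (u, v)}."""
        rev = {}
        for u in graph:
            for v, c in graph[u].items():
                rev.setdefault(v, []).append((u, c))
        dist, pq = {t: 0.0}, [(0.0, t)]
        while pq:
            d, u = heapq.heappop(pq)
            if d > dist.get(u, float("inf")):
                continue
            for w, c in rev.get(u, []):
                if d + c < dist.get(w, float("inf")):
                    dist[w] = d + c
                    heapq.heappush(pq, (d + c, w))
        return dist

    def naive_agent_cost(graph, s, t, b):
        """Cost the present-biased agent actually pays walking from s to t."""
        dist = dist_to_target(graph, t)
        u, paid = s, 0.0
        while u != t:
            v = min(graph[u], key=lambda w: b * graph[u][w] + dist[w])
            paid += graph[u][v]
            u = v
        return paid

    # Two routes from s to t: direct at cost 4, or via m at cost 1 + 1.
    g = {"s": {"m": 1.0, "t": 4.0}, "m": {"t": 1.0}, "t": {}}
    print(naive_agent_cost(g, "s", "t", b=5.0))  # 2.0 here; richer graphs
    # make the agent overpay, which is exactly what the cost ratio measures.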
We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function $f:\{0,1\}^n \rightarrow [0,1]$ within $\ell_2$-error $\epsilon$ over the uniform distribution: 1. If $f$ is submodular, then it is $\epsilon$-close to a function of $O(\frac{1}{\epsilon^2} \log \frac{1}{\epsilon})$ variables. This is an exponential improvement over previously known results. We note that $\Omega(\frac{1}{\epsilon^2})$ variables are necessary even for linear functions. 2. If $f$ is fractionally subadditive (XOS) it is $\epsilon$-close to a function of $2^{O(1/\epsilon^2)}$ variables. This result holds for all functions with low total $\ell_1$-influence and is a real-valued analogue of Friedgut's theorem for boolean functions. We show that $2^{\Omega(1/\epsilon)}$ variables are necessary even for XOS functions. As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time $2^{poly(1/\epsilon)} poly(n)$. For submodular functions we give an algorithm in the more demanding PMAC learning model (Balcan and Harvey, 2011) which requires a multiplicative $1+\gamma$ factor approximation with probability at least $1-\epsilon$ over the target distribution. Our uniform distribution algorithm runs in time $2^{poly(1/(\gamma\epsilon))} poly(n)$. This is the first algorithm in the PMAC model that over the uniform distribution can achieve a constant approximation factor arbitrarily close to 1 for all submodular functions. As follows from the lower bounds in (Feldman et al., 2013) both of these algorithms are close to optimal. We also give applications for proper learning, testing and agnostic learning with value queries of these classes.
We design simple mechanisms to approximate the Gains from Trade (GFT) in two-sided markets with multiple unit-supply sellers and multiple unit-demand buyers. A classical impossibility result by Myerson and Satterthwaite showed that even with only one seller and one buyer, no Individually Rational (IR), Bayesian Incentive Compatible (BIC) and Budget-Balanced (BB) mechanism can achieve full GFT (trade whenever buyer's value is higher than the seller's cost). On the other hand, they proposed the "second-best" mechanism that maximizes the GFT subject to IR, BIC and BB constraints, which is unfortunately rather complex for even the single-seller single-buyer case. Our mechanism is simple, IR, BIC and BB, and achieves $\frac{1}{2}$ of the optimal GFT among all IR, BIC and BB mechanisms. Our result holds for arbitrary distributions of the buyers' and sellers' values and can accommodate any downward-closed feasibility constraints over the allocations. The analysis of our mechanism is facilitated by extending the Cai-Weinberg-Devanur duality framework to two-sided markets.
Submodular Maximization by Simulated Annealing (Shayan Oveis Gharan and Jan Vondrák, Proceedings of the 2011 Annual ACM-SIAM Symposium on Discrete Algorithms (SODA), pp. 1098-1116, DOI: https://doi.org/10.1137/1.9781611973082.83). We consider the problem of maximizing a non-negative (possibly non-monotone) submodular set function with or without constraints. Feige et al. [9] showed a 2/5-approximation for the unconstrained problem and also proved that no approximation better than 1/2 is possible in the value oracle model. Constant-factor approximation has also been known for submodular maximization subject to a matroid independence constraint (a factor of 0.309 [33]) and for submodular maximization subject to a matroid base constraint, provided that the fractional base packing number ν is bounded away from 1 (a 1/4-approximation assuming that ν ≥ 2 [33]). In this paper, we propose a new algorithm for submodular maximization which is based on the idea of simulated annealing. We prove that this algorithm achieves improved approximation for two problems: a 0.41-approximation for unconstrained submodular maximization, and a 0.325-approximation for submodular maximization subject to a matroid independence constraint. On the hardness side, we show that in the value oracle model it is impossible to achieve a 0.478-approximation for submodular maximization subject to a matroid independence constraint, or a 0.394-approximation subject to a matroid base constraint in matroids with two disjoint bases. Even for the special case of cardinality constraint, we prove it is impossible to achieve a 0.491-approximation. (Previously it was conceivable that a 1/2-approximation exists for these problems.) It is still an open question whether a 1/2-approximation is possible for unconstrained submodular maximization.
A central issue in applying auction theory in practice is the problem of dealing with budget-constrained agents. A desirable goal in practice is to design incentive compatible, individually rational, and Pareto optimal auctions while respecting the budget constraints. Achieving this goal is particularly challenging in the presence of nontrivial combinatorial constraints over the set of feasible allocations. Toward this goal and motivated by AdWords auctions, we present an auction for polymatroidal environments satisfying the above properties. Our auction employs a novel clinching technique with a clean geometric description and only needs an oracle access to the submodular function defining the polymatroid. As a result, this auction not only simplifies and generalizes all previous results, it applies to several new applications including AdWords Auctions, bandwidth markets, and video on demand. In particular, our characterization of the AdWords auction as polymatroidal constraints might be of independent interest. This allows us to design the first mechanism for Ad Auctions taking into account simultaneously budgets, multiple keywords and multiple slots.
We consider a monopolist seller with n heterogeneous items, facing a single buyer. The buyer has a value for each item drawn independently according to (non-identical) distributions, and his value for a set of items is additive. The seller aims to maximize his revenue. It is known that an optimal mechanism in this setting may be quite complex, requiring randomization [19] and menus of infinite size [15]. Hart and Nisan [17] have initiated a study of two very simple pricing schemes for this setting: item pricing, in which each item is priced at its monopoly reserve; and bundle pricing, in which the entire set of items is priced and sold as one bundle. Hart and Nisan [17] have shown that neither scheme can guarantee more than a vanishingly small fraction of the optimal revenue. In sharp contrast, we show that for any distributions, the better of item and bundle pricing is a constant-factor approximation to the optimal revenue. We further discuss extensions to multiple buyers and to valuations that are correlated across items.
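A quick Monte Carlo comparison of the two pricing schemes (our illustration; prices are optimized naively against sampled values):

    import random

    def best_posted_price(samples):
        """Best fixed-price revenue against the empirical distribution: if
        the k-th highest sampled value is p, price p sells with empirical
        probability (at least) k/n and earns about p * k / n."""
        n = len(samples)
        top_down = sorted(samples, reverse=True)
        return max(top_down[k - 1] * k / n for k in range(1, n + 1))

    random.seed(0)
    n_items, n_samples = 5, 4000
    # values[j][i]: sampled buyer j's value for item i, drawn i.i.d. U[0, 1].
    values = [[random.random() for _ in range(n_items)]
              for _ in range(n_samples)]

    srev = sum(best_posted_price([row[i] for row in values])
               for i in range(n_items))
    brev = best_posted_price([sum(row) for row in values])
    # With several i.i.d. items the bundle value concentrates, so bundling
    # tends to beat separate sales in this toy setting.
    print(f"separate: {srev:.3f}  bundle: {brev:.3f}  "
          f"better: {max(srev, brev):.3f}")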
We study a combinatorial market design problem, where a collection of indivisible objects is to be priced and sold to potential buyers subject to equilibrium constraints. The classic solution concept for such problems is Walrasian equilibrium (WE), which provides a simple and transparent pricing structure that achieves optimal social welfare. The main weakness of the WE notion is that it exists only in very restrictive cases. To overcome this limitation, we introduce the notion of a combinatorial Walrasian equilibrium (CWE), a natural relaxation of WE. The difference between a CWE and a (noncombinatorial) WE is that the seller can package the items into indivisible bundles prior to sale, and the market does not necessarily clear. We show that every valuation profile admits a CWE that obtains at least half the optimal (unconstrained) social welfare. Moreover, we devise a polynomial time algorithm that, given an arbitrary allocation, computes a CWE that achieves at least half its welfare. Thus, the economic problem of finding a CWE with high social welfare reduces to the algorithmic problem of social-welfare approximation. In addition, we show that every valuation profile admits a CWE that extracts a logarithmic fraction of the optimal welfare as revenue. Finally, to motivate the use of bundles, we establish strong lower bounds when the seller is restricted to using item prices only. The strength of our results derives partly from their generality --- our results hold for arbitrary valuations that may exhibit complex combinations of substitutes and complements.
Central results in economics guarantee the existence of efficient equilibria for various classes of markets. An underlying assumption in early work is that agents are price-takers, i.e., agents honestly report their true demand in response to prices. A line of research in economics, initiated by Hurwicz (1972), is devoted to understanding how such markets perform when agents are strategic about their demands. This is captured by the Walrasian Mechanism, which proceeds by collecting reported demands, finding clearing prices in the reported market via an ascending price tatonnement procedure, and returning the resulting allocation. Similar mechanisms are used, for example, in the daily opening of the New York Stock Exchange and the call market for copper and gold in London.
We study Bayesian mechanism design problems in settings where agents have budgets. Specifically, an agent's utility for an outcome is given by his value for the outcome minus any payment he makes to the mechanism, as long as the payment is below his budget, and is negative infinity otherwise. This discontinuity in the utility function presents a significant challenge in the design of good mechanisms, and classical mechanisms fail to work in settings with budgets. The goal of this paper is to develop general reductions from budget-constrained Bayesian mechanism design to unconstrained Bayesian mechanism design with small loss in performance. We consider this question in the context of the two most well-studied objectives in mechanism design --- social welfare and revenue --- and present constant factor approximations in a number of settings. Some of our results extend to settings where budgets are private and agents need to be incentivized to reveal them truthfully.
We study a central problem in Algorithmic Mechanism Design: constructing truthful mechanisms for welfare maximization in combinatorial auctions with submodular bidders. Dobzinski, Nisan, and Schapira provided the first mechanism that guarantees a non-trivial approximation ratio of O(log^2 m) [STOC'06], where m is the number of items. This was subsequently improved to O(log m log log m) [Dobzinski, APPROX'07] and then to O(log m) [Krysta and Vöcking, ICALP'12]. In this paper we develop the first mechanism that breaks the logarithmic barrier. Specifically, the mechanism provides an approximation ratio of O(√(log m)). Similarly to previous constructions, our mechanism uses polynomially many value and demand queries, and in fact provides the same approximation ratio for the larger class of XOS (a.k.a. fractionally subadditive) valuations. We also develop a computationally efficient implementation of the mechanism for combinatorial auctions with budget-additive bidders. Although in general computing a demand query is NP-hard for budget-additive valuations, we observe that the specific form of demand queries that our mechanism uses can be efficiently computed when bidders are budget additive.
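For context, a demand query asks for a utility-maximizing bundle at given item prices. The sketch below (ours, with hypothetical names) contrasts the trivial additive case with brute force for budget-additive valuations v(S) = min(budget, Σ_{j∈S} v_j), whose general demand problem is knapsack-hard:

```python
from itertools import chain, combinations

def demand_additive(values, prices):
    """Exact demand oracle for an additive bidder: every item with a
    positive margin belongs to a utility-maximizing bundle."""
    return {j for j in values if values[j] > prices[j]}

def demand_budget_additive(values, budget, prices):
    """Brute-force demand oracle for v(S) = min(budget, sum of item values):
    exponential time, reflecting the knapsack-hardness of the general case."""
    def utility(S):
        return min(budget, sum(values[j] for j in S)) - sum(prices[j] for j in S)
    items = list(values)
    bundles = chain.from_iterable(combinations(items, r) for r in range(len(items) + 1))
    return set(max(bundles, key=utility))
```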
Consider a gambler who observes a sequence of independent, non-negative random numbers and is allowed to stop the sequence at any time, claiming a reward equal to the most recent observation. The famous prophet inequality of Krengel, Sucheston, and Garling asserts that a gambler who knows the distribution of each random variable can achieve at least half as much reward, in expectation, as a "prophet" who knows the sampled values of each random variable and can choose the largest one. We generalize this result to the setting in which the gambler and the prophet are allowed to make more than one selection, subject to a matroid constraint. We show that the gambler can still achieve at least half as much reward as the prophet; this result is the best possible, since it is known that the ratio cannot be improved even in the original prophet inequality, which corresponds to the special case of rank-one matroids. Generalizing the result still further, we show that under an intersection of $p$ matroid constraints, the prophet's reward exceeds the gambler's by a factor of at most $O(p)$, and this factor is also tight.
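The rank-one case is easy to simulate with the standard threshold rule: accept the first value above τ = E[max]/2, which earns at least E[max]/2 in expectation. A Monte Carlo sketch of our own (`samplers` is a hypothetical list of sampling functions):

```python
import random

def prophet_vs_gambler(samplers, trials=100_000):
    """Monte Carlo illustration of the single-choice prophet inequality:
    a gambler accepting the first value above tau = E[max]/2 earns at
    least E[max]/2 in expectation."""
    e_max = sum(max(s() for s in samplers) for _ in range(trials)) / trials
    tau = e_max / 2
    gambler = sum(next((x for x in (s() for s in samplers) if x >= tau), 0.0)
                  for _ in range(trials)) / trials
    return e_max, gambler

# e.g. five U[0,1] values followed by one U[0,3] value
samplers = [lambda: random.random() for _ in range(5)] + [lambda: 3 * random.random()]
print(prophet_vs_gambler(samplers))
```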
We study the necessity of interaction between individuals for obtaining approximately efficient economic allocations. We view this as a formalization of Hayek's classic point of view that focuses on the information transfer advantages that markets have relative to centralized planning. We study two settings: combinatorial auctions with unit demand bidders (bipartite matching) and combinatorial auctions with subadditive bidders. In both settings we prove that non-interactive protocols require exponentially larger communication costs than interactive ones, even those that only use a modest amount of interaction.
We study auctions with severe bounds on the communication allowed: each bidder may only transmit t bits of information to the auctioneer. We consider both welfare- and profit-maximizing auctions under this communication restriction. For both measures, we determine the optimal auction and show that the loss incurred relative to unconstrained auctions is mild. We prove non-surprising properties of these kinds of auctions, e.g., that in optimal mechanisms bidders simply report the interval in which their valuation lies, as well as some surprising properties, e.g., that asymmetric auctions are better than symmetric ones and that multi-round auctions reduce the communication complexity only by a linear factor.
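As an illustration of the model (a sketch of our own, not the paper's optimal construction), a symmetric quantized Vickrey auction shows how t bits suffice to report one of 2^t intervals; the paper's point is that carefully chosen asymmetric intervals do strictly better:

```python
def interval_report(value, breakpoints):
    """A t-bit message: the index (among 2^t intervals) of the interval,
    defined by `breakpoints`, that contains `value`."""
    return sum(value >= b for b in breakpoints)

def quantized_vickrey(values, breakpoints):
    """The highest interval wins (ties go to the lower-indexed bidder) and
    the winner pays the lower endpoint of the smallest interval that would
    still have won, i.e., its threshold; reporting the true interval is
    then a dominant strategy."""
    bounds = [0.0] + list(breakpoints)       # lower endpoint of interval k
    reports = [interval_report(v, breakpoints) for v in values]
    winner = min(range(len(values)), key=lambda i: (-reports[i], i))
    need = max(reports[i] + (1 if i < winner else 0)   # lower index wins ties
               for i in range(len(values)) if i != winner)
    return winner, bounds[need]

print(quantized_vickrey([0.9, 0.4, 0.7], [0.25, 0.5, 0.75]))  # bidder 0 pays 0.5
```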
In Combinatorial Public Projects, there is a set of projects that may be undertaken, and a set of self-interested players with a stake in the set of projects chosen. A public planner must choose a subset of these projects, subject to a resource constraint, with the goal of maximizing social welfare. Combinatorial Public Projects has emerged as one of the paradigmatic problems in Algorithmic Mechanism Design, a field concerned with solving fundamental resource allocation problems in the presence of both selfish behavior and the computational constraint of polynomial time. We design a polynomial-time, truthful-in-expectation, (1-1/e)-approximation mechanism for welfare maximization in a fundamental variant of combinatorial public projects. Our results apply to combinatorial public projects when players have valuations that are matroid rank sums (MRS), which encompass most concrete examples of submodular functions studied in this context, including coverage functions and matroid weighted-rank functions. Our approximation factor is the best possible, assuming P ≠ NP. Ours is the first mechanism that achieves a constant factor approximation for a natural NP-hard variant of combinatorial public projects.
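The (1-1/e) factor is the benchmark achieved by the classic greedy algorithm for monotone submodular maximization, sketched below for coverage valuations. Greedy itself is not truthful, so the paper's mechanism is necessarily different; this sketch (ours) only illustrates the benchmark:

```python
def greedy_public_projects(projects, k):
    """Greedy choice of at most k projects for coverage valuations: the
    classic (1 - 1/e)-approximation for monotone submodular objectives."""
    remaining = dict(enumerate(projects))    # project id -> set it covers
    covered, chosen = set(), []
    for _ in range(min(k, len(projects))):
        pid = max(remaining, key=lambda p: len(remaining[p] - covered))
        if not remaining[pid] - covered:
            break                            # no project adds coverage
        chosen.append(pid)
        covered |= remaining.pop(pid)
    return chosen, covered
```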
In this paper we show that payment computation essentially does not present any obstacle in designing truthful mechanisms, even for multi-parameter domains, and even when we can only call the allocation rule once. We present a general reduction that takes any allocation rule which satisfies "cyclic monotonicity" (a known necessary and sufficient condition for truthfulness) and converts it to a truthful mechanism using a single call to the allocation rule, with arbitrarily small loss to the expected social welfare.
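For intuition, in the single-parameter special case truthful payments are pinned down by Myerson's identity p(v) = v·x(v) - ∫_0^v x(t) dt. The sketch below (ours) estimates the integral by sampling, which takes many calls to the allocation rule; the point of the paper is that, with randomization, a single call suffices even in multi-parameter, cyclically monotone domains:

```python
import random

def myerson_payment(alloc, v, samples=100_000):
    """Myerson's identity for a single-parameter agent bidding v:
    p(v) = v * x(v) - integral_0^v x(t) dt, with the integral estimated
    by uniform sampling over [0, v]."""
    integral = v * sum(alloc(random.uniform(0, v)) for _ in range(samples)) / samples
    return v * alloc(v) - integral

# a threshold rule x(t) = 1 iff t >= 5: bidding 8 should pay about 5
print(myerson_payment(lambda t: 1.0 if t >= 5 else 0.0, 8.0))
```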
Individuals working towards a goal often exhibit time-inconsistent behavior, making plans and then failing to follow through. One well-known model of such behavioral anomalies is present-bias discounting: individuals over-weight present costs by a bias factor. This model explains many time-inconsistent behaviors, but can make stark predictions in many settings: individuals either follow the most efficient plan for reaching their goal or procrastinate indefinitely. We propose a modification in which the present-bias parameter can vary over time, drawn independently each step from a fixed distribution. Following Kleinberg and Oren (2014), we use a weighted task graph to model task planning, and measure the cost of procrastination as the relative expected cost of the chosen path versus the optimal path. We use a novel connection to optimal pricing theory to describe the structure of the worst-case task graph for any present-bias distribution. We then leverage this structure to derive conditions on the bias distribution under which the worst-case ratio is exponential (in time) or constant. We also examine conditions on the task graph that lead to improved procrastination ratios: graphs with a uniformly bounded distance to the goal, and graphs in which the distance to the goal monotonically decreases on any path.
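A minimal simulation of this model (our sketch, with hypothetical names): distances to the goal are computed once, and at each node the agent draws a fresh bias b and takes the edge minimizing b·c(u,v) + dist(v):

```python
import heapq, random

def dist_to_goal(graph, goal):
    """Shortest distance from every node to `goal`, by Dijkstra on the
    reverse graph; graph[u][v] is the cost of edge (u, v)."""
    rev = {}
    for u, edges in graph.items():
        for v, c in edges.items():
            rev.setdefault(v, {})[u] = c
    dist, pq = {goal: 0.0}, [(0.0, goal)]
    while pq:
        d, v = heapq.heappop(pq)
        if d > dist.get(v, float("inf")):
            continue
        for u, c in rev.get(v, {}).items():
            if d + c < dist.get(u, float("inf")):
                dist[u] = d + c
                heapq.heappush(pq, (d + c, u))
    return dist

def biased_walk(graph, source, goal, draw_bias):
    """At each node the agent draws a fresh bias b and takes the edge
    minimizing b * c(u, v) + dist(v, goal); returns the cost it pays and
    the procrastination ratio (paid cost over optimal cost)."""
    dist = dist_to_goal(graph, goal)
    u, paid = source, 0.0
    while u != goal:
        b = draw_bias()
        v = min(graph[u], key=lambda w: b * graph[u][w] + dist[w])
        paid += graph[u][v]
        u = v
    return paid, paid / dist[source]

g = {"s": {"a": 1.0, "b": 4.0}, "a": {"t": 4.0}, "b": {"t": 1.0}, "t": {}}
print(biased_walk(g, "s", "t", lambda: random.choice([1.0, 3.0])))
```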
Bulow and Klemperer's well-known result states that, in a single-item auction where the n bidders' values are independently and identically drawn from a regular distribution, the Vickrey auction with one additional bidder (a duplicate) extracts at least as much revenue as the optimal auction without the duplicate. Hartline and Roughgarden, in their influential 2009 paper, removed the requirement that the distributions be identical, at the cost of allowing the Vickrey auction to recruit n duplicates, one from each distribution, and relaxing its revenue advantage to a 2-approximation. In this work we restore Bulow and Klemperer's number of duplicates in Hartline and Roughgarden's more general setting. We show that recruiting a duplicate from one of the distributions suffices for the Vickrey auction to $10$-approximate the optimal revenue. We also show that in a k-unit auction, recruiting k duplicates suffices for the VCG auction to $O(1)$-approximate the optimal revenue. We also tighten the analysis for Hartline and Roughgarden's Vickrey auction with n duplicates. We show that, for two distributions, the Vickrey auction with two duplicates obtains at least $3/4$ of the optimal revenue. This is tight, matching a lower bound of Hartline and Roughgarden. En route, we obtain a transparent analysis of their $2$-approximation, via a natural connection to Ronen's lookahead auction.
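The flavor of the original Bulow-Klemperer statement is easy to check numerically. A Monte Carlo sketch of our own for i.i.d. Exp(1) values, where the optimal auction is second-price with monopoly reserve 1:

```python
import random

def second_price(vals, reserve=0.0):
    """Revenue of a second-price auction with a reserve price."""
    top = sorted(vals, reverse=True)[:2]
    return 0.0 if top[0] < reserve else max(top[1], reserve)

def bk_duplicates(n=3, trials=200_000):
    """For i.i.d. Exp(1) values the optimal auction is second-price with
    monopoly reserve 1; Bulow-Klemperer says Vickrey with one duplicate
    bidder earns at least as much."""
    vick_dup = opt = 0.0
    for _ in range(trials):
        vals = [random.expovariate(1.0) for _ in range(n)]
        vick_dup += second_price(vals + [random.expovariate(1.0)])
        opt += second_price(vals, reserve=1.0)
    return vick_dup / trials, opt / trials

print(bk_duplicates())
```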
Submodular maximization generalizes many fundamental problems in discrete optimization, including Max-Cut in directed/undirected graphs, maximum coverage, maximum facility location and marketing over social networks. In this paper we consider the problem of maximizing any submodular function subject to $d$ knapsack constraints, where $d$ is a fixed constant. We establish a strong relation between the discrete problem and its continuous relaxation, obtained through extension by expectation of the submodular function. Formally, we show that, for any non-negative submodular function, an $\alpha$-approximation algorithm for the continuous relaxation implies a randomized $(\alpha - \varepsilon)$-approximation algorithm for the discrete problem. We use this relation to improve the best known approximation ratio for the problem to $1/4 - \varepsilon$, for any $\varepsilon > 0$, and to obtain a nearly optimal $(1 - e^{-1} - \varepsilon)$-approximation ratio for the monotone case, for any $\varepsilon > 0$. We further show that the probabilistic domain defined by a continuous solution can be reduced to yield a polynomial size domain, given an oracle for the extension by expectation. This leads to a deterministic version of our technique.
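The extension by expectation (also known as the multilinear extension) can itself be evaluated by sampling, which is exactly the kind of oracle the deterministic version assumes. A minimal sketch of our own:

```python
import random

def multilinear_estimate(f, x, samples=20_000):
    """Estimate the extension by expectation F(x) = E[f(R)], where R
    contains each element i independently with probability x[i]."""
    total = 0.0
    for _ in range(samples):
        R = {i for i, xi in enumerate(x) if random.random() < xi}
        total += f(R)
    return total / samples

# a small coverage function: f(S) = size of the union of the chosen sets
ground = [{1, 2}, {2, 3}, {3, 4}]
f = lambda S: len(set().union(*(ground[i] for i in S))) if S else 0
print(multilinear_estimate(f, [0.5, 0.5, 0.5]))  # approx 2.5
```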
We present a computationally-efficient truthful mechanism for combinatorial auctions with subadditive bidders that achieves an $O((\log\!\log{m})^3)$-approximation to the maximum welfare in expectation using $O(n)$ demand queries; here $m$ and $n$ are the number of items and bidders, respectively. This breaks the longstanding logarithmic barrier for the problem dating back to the $O(\log{m}\cdot\log\!\log{m})$-approximation mechanism of Dobzinski from 2007. Along the way, we also improve and considerably simplify the state-of-the-art mechanisms for submodular bidders.