I am a faculty member at the School of Computer Science
at Tel Aviv University.
I am part of the Theory of Computation Group at TAU, and a member of the Israeli
Center of Research Excellence in Algorithms (ICORE) and the Check Point Institute for
Information Security.
My main research interests are Cryptography and Computational
Complexity.
In a multi-party fair coin-flipping protocol, the parties output a common (close to) unbiased bit, even when some corrupted parties try to bias the output.
In this work we focus on the case of an arbitrary number of corrupted parties. Cleve [STOC 1986] has shown that in any such m-round coin-flipping protocol, the corrupted parties can bias the honest parties'
common output bit by Ω(1/m). For more than two decades, however, the best known coin-flipping protocol was that of Awerbuch, Blum, Chor, Goldwasser, and Micali [Manuscript 1985], who presented a t-party, m-round protocol with bias
Θ(t/√m). This was changed by a recent breakthrough result of Moran et al. [TCC 2009],
who constructed an m-round, two-party coin-flipping protocol with optimal bias Θ(1/m).
Recently, Haitner and Tsfadia [STOC 2014] constructed an m-round, three-party coin-flipping protocol with bias O(log^3 m / m).
Still, for the case of more than three parties, the best known protocol remained the Θ(t/√m)-bias protocol of Awerbuch et al.
We make a step towards eliminating the above gap, presenting a t-party, m-round coin-flipping protocol
with bias O(t^3 · 2^{t} · √(log m) / m^{1/2 + 1/(2^{t-1}-2)}).
This improves upon the protocol of Awerbuch et al. for any t < (log log m)/2; in particular, for t ∈ O(1) it is a 1/m^{1/2+Θ(1)}-bias protocol. For the three-party case,
it is an O(√(log m)/m)-bias protocol, improving over the protocol of Haitner and Tsfadia.
Our protocol generalizes that of Haitner and Tsfadia
by presenting appropriate ``defense protocols'' for the remaining parties to interact in, in case some parties abort or are caught cheating.
We analyze the new protocols by presenting a new paradigm for analyzing the fairness of coin-flipping protocols. We map the set of adversarial strategies that try to bias the honest parties' outcome in the protocol to the set of feasible solutions of a linear program, where the gain each strategy achieves is the value of the corresponding solution. We then bound the optimal value of the linear program by constructing a feasible solution to its dual.
Ran Cohen and Iftach Haitner and Eran Omri and Lior Rotem
Characterization of Secure Multiparty Computation Without Broadcast
TCC 2016-A [link]
Draft of full version [pdf]
A major challenge in the study of cryptography is characterizing the necessary and sufficient assumptions required to carry out a given cryptographic task. The focus of this work is the necessity of a broadcast channel for securely computing symmetric functionalities (where all the parties receive the same output) when one third of the parties, or more, might be corrupted. Assuming all parties are connected via a peer-to-peer network, but no broadcast channel (nor a secure setup phase) is available, we prove the following characterization:
Iftach Haitner and Yuval Ishai and Eran Omri and Ronen Shaltiel
Parallel Hashing via List Recoverability
Crypto 2015 [pdf]
Motivated by the goal of constructing efficient hash functions, we investigate the possibility of hashing a long message by only making parallel, non-adaptive calls to a hash function on short messages. Our main result is a simple construction of a collision-resistant hash function h:{0,1}^{n} → {0,1}^{k} that makes a polynomial number of parallel calls to a \emph{random} function f:{0,1}^{k} → {0,1}^{k}, for any polynomial n=n(k). This should be compared with the traditional use of a Merkle hash tree, which requires at least log(n/k) rounds of calls to f, and with a more complex construction of Maurer and Tessaro (Crypto 2007) that requires two rounds of calls to f. We also show that our hash function h satisfies a relaxed form of the notion of indifferentiability of Maurer et al. (TCC 2004) that suffices for implementing the Fiat-Shamir paradigm. As a corollary, we get sublinear-communication non-interactive arguments for NP that only make two rounds of calls to a small random oracle. An attractive feature of our construction is that h can be implemented by boolean circuits that only contain parity gates in addition to the parallel calls to f. Thus, we get the first domain-extension scheme that is degree-preserving, in the sense that the algebraic degree of h over the binary field is equal to that of f. Our construction makes use of list-recoverable codes, a generalization of list-decodable codes that is closely related to the notion of randomness condensers. We show that list-recoverable codes are necessary for any construction of this type.
Iftach Haitner, Jonathan J. Hoch, Omer Reingold
and Gil Segev
Finding Collisions in Interactive Protocols — Tight Lower Bounds on the Round Complexity and Communication Complexity of Statistically Hiding Commitments
SIAM Journal on Computing 2015 [link]
Draft of full version [pdf]
We study the round complexity and communication complexity of various cryptographic protocols. We give tight lower bounds on the round complexity and communication complexity of any fully black-box reduction of a statistically hiding commitment scheme from one-way permutations, and from trapdoor permutations. As a corollary, we derive similar tight lower bounds for several other cryptographic protocols, such as single-server private information retrieval, interactive hashing, and oblivious transfer that guarantees statistical security for one of the parties. Our techniques extend the collision-finding oracle due to Simon [EUROCRYPT '98] to the setting of interactive protocols, and the reconstruction paradigm of Gennaro and Trevisan [FOCS '00].
In their seminal work, Impagliazzo and Rudich (STOC '89) showed that no key-agreement protocol exists in the random-oracle model, yielding that key agreement cannot be black-box reduced to one-way functions. In this work, we generalize their result, showing that, to a large extent, no-private-input, semi-honest, two-party functionalities that can be securely implemented in the random oracle model can be securely implemented information theoretically (where parties are assumed to be all powerful, and no oracle is given). Using a recent information-theoretic impossibility result by McGregor et al. (FOCS '10), our result yields that certain functionalities (e.g., inner product) cannot be computed both in an accurate and in a differentially private manner in the random oracle model, implying that protocols for computing these functionalities cannot be black-box reduced to the existence of one-way functions.
Itay Berman and Iftach Haitner and Aris Tentes
Coin Flipping of Any Constant Bias Implies One-Way Functions
[slides]
STOC 2014 [pdf]
Draft of full version [pdf]
We show that the existence of a coin-flipping protocol safe against any non-trivial constant bias (e.g., .499) implies the existence of one-way functions. This improves upon a recent result of Haitner and Omri [FOCS '11], who proved this implication for protocols with bias (√2−1)/2 ≈ .207. Unlike the result of Haitner and Omri, our result also holds for weak coin-flipping protocols.
In a multiparty, fair coin-flipping protocol, the parties output a common (close to) unbiased bit, even when some corrupted parties try to bias the output.
Cleve [STOC 1986] has shown that in the case of a dishonest
majority (i.e., at least half of the parties can be corrupted), the corrupted parties can bias the honest parties' common output bit
by Ω(1/m) in any m-round coin-flipping protocol. For more than two decades, however, the best known
coin-flipping protocols had bias Θ(t/√m)
against dishonest majority, where t is the number of
corrupted parties. This was changed by a recent breakthrough result of Moran et al. [TCC 2009],
who constructed an m-round,
two-party coin-flipping protocol with optimal bias Θ(1/m). In a
subsequent work, Beimel et al. [Crypto 2010] extended this result to the multiparty
case in which less than 2/3 of
the parties can be corrupted. More specifically, for any t < 2/3 · k, they presented an m-round, k-party
protocol with bias Θ(2^{2t−k}/m) against (up to) t
corrupted parties. Still, for the case of 2/3
(or more) corrupted parties, the best known protocol had bias Θ(t/√m). In particular,
this was the state of affairs for the natural three-party case (against
two corrupted parties).
We make a step towards eliminating the above gap, presenting an m-round, three-party
coin-flipping protocol with bias O(log^3 m / m). Our approach (which we also apply to the two-party case) does not
follow the ``threshold round'' paradigm used in the works of Moran et al. and Beimel et al.,
but rather is a takeoff of the majority protocol of Cleve (used to obtain the Θ(t/√m)-bias protocol
mentioned above).
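For intuition, the majority protocol underlying the Θ(t/√m)-bias bound can be sketched as follows. This is a toy model only: it ignores the commitments and secret sharing used to realize each joint coin, and all names are illustrative, not from the papers.

```python
import random

def majority_protocol(m, rng=random):
    """Toy sketch of the majority protocol: the parties jointly generate
    m (close to) unbiased coins, one per round, and output the majority.
    An adversary that sees each coin before deciding whether to abort can
    shift the majority, which is the source of the Theta(t/sqrt(m)) bias."""
    coins = [rng.randint(0, 1) for _ in range(m)]
    return 1 if 2 * sum(coins) > m else 0
```

The point of the sketch is that a single adversarial abort changes the outcome only when the remaining coins are nearly tied, an event of probability Θ(1/√m) per corrupted party.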
Itay Berman and Iftach Haitner and
Ilan Komargodski and Moni Naor
Hardness Preserving Reductions via Cuckoo Hashing
TCC 2013 [pdf]
Draft of full version [pdf]
Itay Berman and Iftach Haitner
From Non-Adaptive to Adaptive Pseudorandom Functions
Journal of Cryptology 2013 [link]
TCC 2012 [pdf]
Draft of full version [pdf]
Iftach Haitner and Eran Omri
Coin Flipping with Constant Bias Implies One-Way Functions
[slides]
SIAM Journal on Computing 2013 [link]
FOCS 2011 [pdf]
Draft of full version [pdf]
It is well known (cf. Impagliazzo and Luby [FOCS '89]) that the existence
of almost all ``interesting'' cryptographic applications, i.e., ones that cannot hold information theoretically, implies one-way functions.
An important exception for which the above implication is not known, however, is the case of coin-flipping protocols. Such protocols allow
honest parties to mutually flip an unbiased coin, while guaranteeing that even a cheating (efficient) party cannot bias the output of the
protocol by much. Impagliazzo and Luby proved that coin-flipping protocols that are safe against negligible bias imply one-way
functions, and, very recently, Maji, Prabhakaran, and Sahai [FOCS '10]
proved the same for constant-round protocols (with any non-trivial bias). For the general case, however, no such implication was known.
We make progress towards answering the above fundamental question, showing that (strong) coin-flipping protocols safe against a constant
bias (concretely, (√2−1)/2 ≈ .207) imply one-way functions.
Iftach Haitner, Omer Reingold and Salil Vadhan
Efficiency Improvements in Constructing Pseudorandom Generators from One-way Functions
[slides]
SIAM Journal on Computing 2013 [link]
Annual ACM Symposium on Theory of Computing (STOC), 2010 [pdf]
Draft of full version [pdf]
We give a new construction of pseudorandom generators from any one-way function. The construction achieves better parameters and is simpler than that given in the seminal work of Håstad, Impagliazzo, Levin and Luby [SICOMP '99]. The key to our construction is a new notion of next-block pseudoentropy, which is inspired by the notion of ``inaccessible entropy'' recently introduced in Haitner, Reingold, Vadhan and Wee [STOC '09]. An additional advantage over all previous constructions is that our pseudorandom generators are highly parallelizable and invoke the one-way function in a non-adaptive manner. Using a result of Applebaum, Ishai and Kushilevitz [SICOMP '06], this implies the existence of pseudorandom generators in NC^{0} based on the existence of one-way functions in NC^{1}.
Iftach Haitner
A Parallel Repetition Theorem for Any Interactive Argument
[slides]
SIAM Journal on Computing 2013 [link]
IEEE Symposium on Foundations of Computer Science (FOCS), 2009
Draft of full version [pdf]
The question whether or not parallel repetition reduces the soundness error is a fundamental question in the theory of protocols.
While parallel repetition reduces the error at an exponential rate in interactive proofs, and at a weak exponential rate in special cases of interactive
arguments (e.g., 3-message protocols - Bellare, Impagliazzo and Naor [FOCS '97], and
public-coin protocols - Håstad, Pass, Pietrzak and
Wikström [TCC '10]), Bellare et al. gave an example of interactive arguments for which parallel repetition does not reduce the soundness
error at all.
We show that by slightly modifying any interactive argument, in a way that preserves its completeness and only
slightly deteriorates its soundness, we get a protocol for which parallel repetition does reduce the error at a weak exponential rate.
In this modified version, the verifier flips at the beginning of each round a (1 − 1/4m, 1/4m)-biased
coin (i.e., 1 is tossed with probability 1/4m), where m is the round complexity of the
(original) protocol. If the coin is one, the verifier halts the interaction and accepts, otherwise it sends the same message that the
original verifier would. At the end of the protocol (if reached), the
verifier accepts if and only if the original verifier would.
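The modified verifier described above can be sketched as a wrapper (a minimal Python sketch; `original_round_fn` and `accept_fn` are hypothetical stand-ins for the original verifier's per-round message and final accept predicate, not names from the paper):

```python
import random

def random_termination_verifier(original_round_fn, accept_fn, m, rng=random):
    """Wrap an m-round verifier: before each round, halt and accept with
    probability 1/(4m); otherwise send the original verifier's message.
    If the protocol runs to completion, defer to the original verdict."""
    for r in range(m):
        if rng.random() < 1 / (4 * m):
            return True  # biased coin came up 1: early accept
        original_round_fn(r)  # behave exactly as the original verifier
    return accept_fn()
```

The wrapper only slightly deteriorates soundness (each run accepts spuriously with probability at most 1/4), which is the price paid for making parallel repetition effective.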
Yevgeniy Dodis, Iftach Haitner and Aris Tentes
On the Instantiability of Hash-and-Sign RSA Signatures
[slides]
TCC 2012 [pdf]
Draft of full version [pdf]
The hash-and-sign RSA signature is one of the most elegant and well known signatures schemes, extensively used in a wide variety of cryptographic
applications. Unfortunately, the only existing analysis of this popular signature scheme is in the random oracle model, where the resulting
idealized signature is known as the RSA Full Domain Hash signature scheme (RSA-FDH). In fact, prior work has shown several
``uninstantiability'' results for various abstractions of RSA-FDH, where the RSA function was replaced by a family of trapdoor random
permutations, or the hash function instantiating the random oracle could not be keyed. These abstractions, however, do not allow the
reduction and the hash function instantiation to use the algebraic properties of the RSA function, such as the multiplicative group structure
of Z_{n}^{*}. In contrast, the multiplicative property of
the RSA function is critically used in many standard model analyses of various RSA-based schemes.
Motivated by closing this gap, we consider the setting where the RSA function representation is generic (i.e., black-box) but
multiplicative, whereas the hash function itself is in the standard model, and can be keyed and exploit the multiplicative properties of
the RSA function. This setting abstracts all known techniques for designing provably secure RSA-based signatures in the standard model, and aims to address
the main limitations of prior uninstantiability results. Unfortunately, we show that it is still impossible to reduce the
security of RSA-FDH to any natural assumption even in our model. Thus, our result suggests that in order to prove the security of a given instantiation
of RSA-FDH, one should use a non-black-box security proof, or use specific properties of the RSA group
that are not captured by its multiplicative structure alone.
We complement our negative result with a positive result, showing that the RSA-FDH signatures can be proven
secure under the standard RSA assumption, provided that the number of signing queries is a-priori bounded.
Iftach Haitner, Yuval Ishai, Eyal
Kushilevitz, Yehuda Lindell and Erez Petrank
Black-Box Constructions of Protocols for Secure Computation
SIAM Journal on Computing 2011 [link]
Draft of full version [pdf]
It is well known that secure computation without an honest majority requires computational assumptions. An interesting question that
therefore arises relates to the way such computational assumptions are used. Specifically, can the secure protocol use the underlying
primitive (e.g., a one-way trapdoor permutation) in a black-box way, treating it as an oracle, or must it be non-black-box (by referring to
the code that computes the primitive)?
Despite the fact that many general constructions of cryptographic
schemes refer to the underlying primitive in a black-box way only,
there are some constructions that are inherently nonblack-box. Indeed,
all known constructions of protocols for general secure computation
that are secure in the presence of a malicious adversary and without an
honest majority use the underlying primitive in a nonblack-box way
(requiring one to prove in zero-knowledge statements that relate to the
primitive).
In this paper, we study whether such nonblack-box use is essential. We
answer this question in the negative. Concretely, we present a fully
black-box reduction from oblivious transfer with security against
malicious parties to oblivious transfer with security against
semi-honest parties. As a corollary, we get the first constructions of
general multiparty protocols (with security against malicious
adversaries and without an honest majority) which only make a black-box
use of semi-honest oblivious transfer, or alternatively a black-box use
of lower-level primitives such as enhanced trapdoor permutations or
homomorphic encryption.
Boaz Barak, Iftach Haitner, Dennis Hofheinz
and Yuval Ishai
Bounded Key-Dependent Message Security
Advances in Cryptology - Eurocrypt, 2010 [pdf]
Draft of full version [pdf]
We construct the first public-key encryption scheme that is proven secure
(in the standard model, under standard assumptions) even when the attacker gets access to encryptions of arbitrary efficient functions of the secret key.
Specifically, under either the DDH or LWE assumption, for all polynomials L and N we obtain a
public-key encryption scheme that resists key-dependent message (KDM) attacks for up to N(k)
public keys and functions of circuit size up to L(k), where k denotes the size of the secret
key. We call such a scheme bounded KDM secure. Moreover, we show that our scheme suffices for
one of the important applications of KDM security: ability to securely instantiate symbolic protocols with
axiomatic proofs of security.
We also observe that any fully homomorphic encryption scheme which
additionally enjoys circular security and circuit privacy is fully KDM secure in the sense that the encryption and
decryption algorithms can be independent of the polynomials L and N as
above. Thus, the recent fully homomorphic encryption scheme of Gentry [STOC 2009] is fully KDM
secure under certain non-standard hardness assumptions. Previous works obtained either full KDM security in the random oracle
model (Black et al. [SAC 2002]) or security with respect to a very restricted
class of functions (e.g., clique/circular security and affine functions; Boneh et al. [CRYPTO 2008] and
Applebaum et al. [CRYPTO 2009]).
Our main result is based on a combination of the circular-secure
encryption scheme of either Boneh et al. or Applebaum et al. with
Yao's garbled circuit construction.
Finally, we extend the impossibility result of Haitner and Holenstein [TCC 2009], showing that it
is impossible to prove KDM security against a family of query functions that contains exponentially hard pseudorandom functions,
using only black-box access to the query function and to the adversary attacking the scheme. This proves that the non-black-box
usage of the KDM query function in our proof of security is inherent.
Iftach Haitner, Thomas Holenstein, Omer
Reingold, Salil Vadhan and Hoeteck Wee
Universal One-Way Hash Functions via Inaccessible Entropy
Advances in Cryptology - Eurocrypt, 2010 [pdf]
Draft of full version [pdf]
This paper revisits the construction of Universal One-Way Hash Functions (UOWHFs) from any one-way function due to Rompel [STOC 1990].
We give a simpler construction of UOWHFs, which also obtains better efficiency and security. The construction exploits a strong connection to the recently
introduced notion of inaccessible entropy of Haitner et
al. [STOC 2009]. With this perspective, we observe that a small tweak of
any one-way function f is already a weak form of a UOWHF: Consider
F(x,i) that outputs the i-bit long
prefix of f(x). If F were a UOWHF then given a
random x and i it would be hard to come up with x' ≠ x such
that F(x,i)=F(x',i). While this may not be the case, we show (rather easily) that it is hard
to sample x' with almost full entropy among all the possible such values of x'. The rest of our construction
simply amplifies and exploits this basic property.
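A minimal sketch of the tweak described above, using SHA-256 as an illustrative stand-in for the one-way function f (the construction works for any OWF; the bit-masking helper is ours, not from the paper):

```python
import hashlib

def f(x: bytes) -> bytes:
    # Stand-in one-way function; any OWF works in the construction.
    return hashlib.sha256(x).digest()

def F(x: bytes, i: int) -> bytes:
    """F(x, i) = the i-bit prefix of f(x).  On a random (x, i), it is
    already hard to sample a colliding x' != x with almost full entropy."""
    y = f(x)
    nbytes, rem = divmod(i, 8)
    prefix = bytearray(y[:nbytes + (1 if rem else 0)])
    if rem:
        # Zero out the bits beyond position i in the last byte,
        # keeping only the top `rem` bits.
        prefix[-1] &= (0xFF << (8 - rem)) & 0xFF
    return bytes(prefix)
```

Note that for i equal to the full output length, F(x, i) = f(x), while for small i collisions become abundant; the amplification step in the paper exploits how the entropy of colliding preimages varies with i.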
With this and other recent works we have that the constructions of
three fundamental cryptographic primitives (Pseudorandom Generators,
Statistically Hiding Commitments and UOWHFs) out of one-way functions
are to a large extent unified. In particular, all three constructions
rely on and manipulate computational notions of entropy in similar
ways. Pseudorandom Generators rely on the well-established notion of
pseudoentropy, whereas Statistically Hiding Commitments and UOWHFs rely
on the newer notion of inaccessible entropy.
Iftach Haitner, Mohammad Mahmoody and David Xiao
New Sampling Protocol and Applications to Basing Cryptographic Primitives on the Hardness of NP
IEEE Conference on Computational Complexity (CCC), 2010 [pdf]
Draft of full version [pdf]
We investigate the question of what languages can be decided efficiently with the help of a recursive collision-finding oracle. Such an oracle can be used to break
collision-resistant hash functions or, more generally, statistically hiding commitments. The
oracle we consider, Sam_{d}, where d is the
recursion depth, is based on the identically-named oracle defined in the work of
Haitner et al. (FOCS '07). Our main result is a constant-round public-coin protocol AM-Sam
that allows an efficient verifier to emulate a Sam_{d}
oracle for any constant depth d = O(1) with the help of a BPP^{NP}
prover. AM-Sam allows us to conclude that if L
is decidable by a k-adaptive randomized oracle algorithm with access to a Sam_{O(1)}
oracle, then L∈AM[k]∩coAM[k].
The above yields the following corollary: assume there exists an
O(1)-adaptive reduction that bases constant-round statistically hiding commitment on NP-hardness,
then NP ⊆ coAM and the polynomial hierarchy collapses. The same result holds for any
primitive that can be broken by Sam_{O(1)} including collision-resistant hash functions and O(1)-round
oblivious transfer where security holds statistically for one of the parties. We also obtain
non-trivial (though weaker) consequences for k-adaptive reductions for any k = poly(n).
Prior to our work, most results in this research direction either applied only to non-adaptive reductions (Bogdanov and Trevisan
[SIAM J. of Comp. '06]) or to primitives with special regularity properties (Brassard [FOCS '79], Akavia et al. [FOCS '06]).
The main technical tool we use to prove the above is a new constant-round public-coin protocol (SampleWithSize)
that we believe may be interesting in its own right, and that guarantees the following. Given an efficient function
f on n
bits, let D be the output distribution D = f(U_{n}),
then SampleWithSize allows an efficient verifier Arthur to use an all-powerful prover Merlin's help to sample a random y ← D
along with a good multiplicative approximation of the probability p_{y}
= Pr_{y' ← D}[y' = y]. The crucial feature of SampleWithSize
is that it extends even to distributions of the form D = f(U_{S}),
where U_{S}
is the uniform distribution on an efficiently decidable subset S ⊆ {0,1}^{n}
(such D are called efficiently samplable with post-selection), as long
as the verifier is also given a good approximation of the value |S|.
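For intuition only, here is a brute-force version of what SampleWithSize computes, for a toy f over n-bit inputs. In the actual AM protocol it is the prover who supplies the (approximation of the) probability p_y; the exhaustive enumeration below is just illustrative and only feasible for tiny n.

```python
import random

def sample_with_size(f, n, rng=random):
    """Sample y <- D = f(U_n) together with the exact probability
    p_y = Pr_{y' <- D}[y' = y], by exhausting all 2^n inputs."""
    x = rng.randrange(2 ** n)   # draw x uniformly from {0,1}^n
    y = f(x)
    p_y = sum(1 for x2 in range(2 ** n) if f(x2) == y) / 2 ** n
    return y, p_y
```

For example, with f(x) = x mod 4 and n = 3, every output value has exactly two preimages, so p_y = 1/4 for every sampled y.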
Iftach Haitner, Minh-Huyen Nguyen, Shien Jin Ong, Omer Reingold and Salil Vadhan
Statistically Hiding Commitments and Statistical Zero-Knowledge Arguments from Any One-Way Function
SIAM Journal on Computing 2009 [link]
Draft of full version [pdf]
We give a construction of statistically hiding commitment schemes (those in which the hiding property holds against even computationally unbounded adversaries) under the minimal complexity assumption that one-way functions exist. Consequently, one-way functions suffice to give statistical zero-knowledge arguments for any NP statement (whereby even a computationally unbounded adversarial verifier learns nothing other than the fact that the assertion being proven is true, and no polynomial-time adversarial prover can convince the verifier of a false statement). These results resolve an open question posed by Naor et al. [J. Cryptology 98].
Iftach
Haitner, Omer Reingold,
Salil Vadhan and Hoeteck Wee
Inaccessible Entropy
[slides]
[video (appx. 1 hour)]
Annual ACM Symposium on Theory of Computing (STOC), 2009 [pdf]
Draft of full version [pdf]
We put forth a new computational notion of entropy, which measures the (in)feasibility of sampling high-entropy strings that are consistent with a given protocol. Specifically, we say that the i'th round of a protocol (A,B) has accessible entropy at most k if no polynomial-time strategy A* can generate messages for A such that its message in the i'th round has entropy greater than k when conditioned both on prior messages of the protocol and on prior coin tosses of A*. As applications of this notion, we
Iftach Haitner, Alon Rosen and Ronen Shaltiel
On the (Im)Possibility of Arthur-Merlin Witness Hiding Protocols
[slides]
Theory of Cryptography Conference (TCC), 2009 [pdf]
The concept of witness-hiding suggested by Feige and Shamir [STOC 90] is a natural relaxation of zero-knowledge. In this paper we
identify languages and distributions for which many known constant-round
public-coin protocols with negligible soundness cannot be shown to be witness-hiding using
black-box techniques. In particular, our results imply that it is impossible to prove that
parallel repetition of 3-Colorability or Hamiltonicity is witness-hiding for distributions that support instances with exactly one witness, if the
proof of security is by a black-box reduction that is independent of the choice of the commitment scheme used in the protocol. This lower bound
conceptually matches an upper bound of Feige and Shamir that uses such a black-box reduction to show that parallel repetition of 3-Colorability
or Hamiltonicity is witness-hiding for distributions with ``two independent witnesses''.
We also consider black-box reductions for 3-Colorability or Hamiltonicity that depend on a specific
implementation of the commitment scheme. While we cannot rule out such reductions completely, we show that
``natural reductions'' cannot bypass the limitations above.
Our proofs use techniques developed by Goldreich and Krawczyk [ICALP 90] for the
case of zero knowledge. The setup of witness-hiding, however, presents new technical and
conceptual difficulties that do not come up in the setup of zero-knowledge. The
high level idea is that if a black-box reduction establishes the witness-hiding
property for a protocol that is also a proof of knowledge, then this latter
property can be used ``against the reduction'' to find witnesses unconditionally.
Iftach Haitner and Thomas Holenstein
On the (Im)Possibility of Key Dependent Encryption
[slides]
Theory of Cryptography Conference (TCC), 2009 [pdf]
Draft of full version [pdf]
We study the possibility of constructing encryption schemes secure under messages that are chosen depending on the key k of the encryption scheme itself. We give the following separation results that hold both in the private and in the public key settings:
Iftach Haitner
New Implications and Improved Efficiency of Constructions Based on One-way Functions
PhD. Thesis, 2008
[pdf]
Iftach Haitner
Semi-Honest to Malicious Oblivious Transfer -- The Black-Box Way
[slides]
Theory of Cryptography Conference (TCC), 2008 [pdf]
Full version is part of Black-Box Constructions of Protocols for Secure Computation
Until recently, all known constructions of oblivious transfer protocols based on general hardness assumptions had the following form. First, the hardness assumption is used in a black-box manner (i.e., the construction uses only the input/output behavior of the primitive guaranteed by the assumption) to construct a semi-honest oblivious transfer, a protocol whose security is guaranteed to hold only against adversaries that follow the prescribed protocol. Then, the latter protocol is ``compiled'' into a (malicious) oblivious transfer using non-black-box techniques (a Karp reduction is carried out in order to prove an NP statement in zero-knowledge). In their recent breakthrough result, Ishai, Kushilevitz, Lindell and Petrank [STOC '06] deviated from the above paradigm, presenting a black-box reduction from oblivious transfer to enhanced trapdoor permutations and to homomorphic encryption. Here we generalize their result, presenting a black-box reduction from oblivious transfer to semi-honest oblivious transfer. Consequently, oblivious transfer can be black-box reduced to each of the hardness assumptions known to imply a semi-honest oblivious transfer in a black-box manner. This list currently includes, besides the hardness assumptions used by Ishai et al., the existence of families of dense trapdoor permutations and of non-trivial single-server private information retrieval.
Iftach Haitner, Jonathan J. Hoch and Gil Segev
A Linear Lower Bound on the Communication Complexity of Single-Server Private Information Retrieval
[slides]
Theory of Cryptography Conference (TCC), 2008 [pdf]
Full version is part of Finding Collisions in Interactive Protocols — Tight Lower Bounds on the Round Complexity and
Communication Complexity of Statistically Hiding Commitments
We study the communication complexity of single-server Private Information Retrieval (PIR) protocols that are based on fundamental cryptographic
primitives in a black-box manner. In this setting, we establish a tight lower bound on the number of bits communicated by the server in any
polynomially-preserving construction that relies on trapdoor permutations. More specifically, our main result states that in such
constructions Ω(n) bits must be communicated by the server, where n
is the size of the server's database, and this improves the Ω(n / log n) lower bound due to
Haitner, Hoch, Reingold and Segev [FOCS '07]. Therefore, in the setting under consideration, the naive solution in which the user downloads the
entire database turns out to be optimal up to constant multiplicative factors. We note that the lower bound we establish holds for the most
generic form of trapdoor permutations, including in particular enhanced trapdoor permutations.
Technically speaking, this paper consists of two main contributions from which our lower bound is obtained. First, we derive a tight lower
bound on the number of bits communicated by the sender during the commit stage of any black-box construction of a statistically-hiding
bit-commitment scheme from a family of trapdoor permutations. This lower bound asymptotically matches the upper bound provided by the
scheme of Naor, Ostrovsky, Venkatesan and Yung [CRYPTO '92]. Second, we improve the efficiency of the reduction of statistically-hiding
commitment schemes to low-communication single-server PIR, due to Beimel, Ishai, Kushilevitz and Malkin [STOC '99]. In particular, we
present a reduction that essentially preserves the communication complexity of the underlying single-server PIR protocol.
Iftach Haitner, Jonathan J. Hoch, Omer Reingold
and Gil Segev
Finding Collisions in Interactive Protocols -- A Tight Lower Bound on the Round Complexity of Statistically-Hiding Commitments
[slides]
IEEE Symposium on Foundations of Computer Science (FOCS), 2007 [pdf]
Full version is part of Finding Collisions in Interactive Protocols — Tight Lower Bounds on the Round Complexity and
Communication Complexity of Statistically Hiding Commitments
We study the round complexity of various cryptographic protocols. Our main result is a tight lower bound on
the round complexity of any fully-black-box construction of a statistically-hiding commitment scheme from one-way
permutations, and even from trapdoor permutations. This lower bound matches the round complexity of the
statistically-hiding commitment scheme due to Naor, Ostrovsky, Venkatesan and Yung [CRYPTO ’92].
As a corollary, we derive similar tight lower bounds for several other cryptographic protocols, such as single-server private
information retrieval, interactive hashing, and oblivious transfer that guarantees statistical security for one of the parties.
Our techniques extend the collision-finding oracle due to Simon [EUROCRYPT ’98] to the setting of interactive protocols
(our extension also implies an alternative proof for the main property of the original oracle). In addition, we substantially extend the reconstruction paradigm of
Gennaro and Trevisan [FOCS '00]. In both cases, our extensions are quite delicate and may prove useful in deriving
additional black-box separation results.
Iftach Haitner and Omer Reingold
Statistically Hiding Commitment from Any One-Way Function
[slides]
Annual ACM Symposium on Theory of Computing (STOC), 2007 [pdf]
Full version is part of Statistically Hiding Commitments and Statistical Zero-Knowledge Arguments from Any One-Way Function
We give a construction of statistically-hiding commitment schemes (ones where the hiding property holds information theoretically), based on the minimal cryptographic assumption that one-way functions exist. Our construction employs two-phase commitment schemes, recently constructed by Nguyen, Ong and Vadhan [FOCS '06], and universal one-way hash functions introduced and constructed by Naor and Yung [STOC '89] and Rompel [STOC '90].
Iftach Haitner and Omer Reingold
A New Interactive Hashing Theorem
[slides]
Journal of Cryptology 2014 [link]
IEEE Conference on Computational Complexity (CCC), 2007 [pdf]
Draft of full version [pdf]
Interactive hashing, introduced by Naor, Ostrovsky, Venkatesan and Yung [CRYPTO '92],
plays an important role in many cryptographic protocols. In particular, interactive hashing is a major component in all known constructions
(which are based on general one-way permutations and one-way functions) of statistically hiding commitment schemes and of statistical
zero-knowledge arguments. Interactive hashing with respect to a one-way permutation f is a
two-party protocol that enables a sender that knows y=f(x) to transfer a random hash
z=h(y) to a receiver. The receiver is guaranteed that the sender is committed to y (in the sense that it cannot
come up with x and x' such that f(x) ≠ f(x'), but h(f(x))=h(f(x'))=z). The sender
is guaranteed that the receiver does not learn any additional information on y. In particular, when h is a two-to-one hash
function, the receiver does not learn which of the two preimages {y,y'}=h^{-1}(z)
is the one the sender can invert with respect to f.
This paper reexamines the notion of interactive hashing. We give an alternative proof for the Naor et al. protocol, which seems
significantly simpler and more intuitive than the original one. Moreover, the new proof achieves much better parameters (in terms of
how security preserving the reduction is). Finally, by applying our new proof to a close variant of the Naor et al. protocol, we achieve a more
versatile interactive hashing theorem that applies in a more general setting than the original one.
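The two-to-one hashing at the heart of the functionality can be illustrated with a toy sketch. This is an illustrative assumption, not the NOVY protocol itself: the one-way permutation f and the round-by-round interaction are abstracted away, and h is taken to be a random full-rank linear map over GF(2) from N bits to N-1 bits, which is exactly two-to-one:

```python
import random

N = 8  # toy domain: y is an N-bit string; h maps it to N-1 bits

def dot(a, y):
    """Inner product over GF(2), with bit strings encoded as ints."""
    return bin(a & y).count("1") % 2

def rank(rows):
    """Rank of the row vectors over GF(2), via Gaussian elimination."""
    pivots = {}
    r = 0
    for v in rows:
        for bit in range(N - 1, -1, -1):
            if not (v >> bit) & 1:
                continue
            if bit in pivots:
                v ^= pivots[bit]  # clear the leading bit and keep reducing
            else:
                pivots[bit] = v
                r += 1
                break
    return r

def sample_two_to_one_hash():
    """N-1 random linear functionals; full rank makes h exactly two-to-one."""
    while True:
        rows = [random.randrange(1, 1 << N) for _ in range(N - 1)]
        if rank(rows) == N - 1:
            return rows

def h(rows, y):
    return tuple(dot(a, y) for a in rows)

# The sender holds y = f(x); here y is just a random string since f is abstracted away.
y = random.randrange(1 << N)
rows = sample_two_to_one_hash()
z = h(rows, y)  # the hash value transferred to the receiver
preimages = [u for u in range(1 << N) if h(rows, u) == z]
print(len(preimages), y in preimages)  # exactly 2 preimages, one of them y
```

Since the kernel of a full-rank (N-1)-by-N matrix over GF(2) has dimension one, every fiber h^{-1}(z) has exactly two elements, matching the {y, y'} pair in the description above.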
Iftach Haitner, Danny Harnik, and Omer Reingold
On the Power of the Randomized Iterate
[slides]
SIAM Journal on Computing 2011 [link]
Advances in Cryptology - CRYPTO, 2006 [pdf]
Draft of full version [pdf]
We consider two of the most fundamental theorems in Cryptography. The first, due to Håstad, Impagliazzo, Levin and Luby [STOC '89, STOC '90,
SIAM J. on Computing '99], is that pseudorandom generators can be constructed from any one-way function. The second, due to Yao [FOCS
'82], states that the existence of weak one-way functions implies the
existence of full fledged one-way functions. These powerful plausibility results shape our understanding of hardness and randomness
in Cryptography, but unfortunately their proofs are not as tight (i.e., security preserving) as one may desire.
This work revisits a technique that we call the randomized
iterate, introduced by Goldreich, Krawczyk and Luby [SIAM J.
on Computing '93]. This technique was used by Goldreich et al. to give a construction of pseudorandom generators from regular one-way
functions. We simplify and strengthen this technique in order to obtain a similar construction where the seed length of the resulting
generators is as short as Θ(n log n) rather than Θ(n^{3})
achieved by Goldreich et al. Our technique has the potential of implying seed-length Θ(n),
and the only bottleneck for such a result is the parameters of current generators against space bounded computations. We give a construction
with similar parameters for security amplification of regular one-way functions. This improves upon the construction of Goldreich,
Impagliazzo, Levin, Venkatesan and Zuckerman [FOCS '90] in that the
construction does not need to "know" the regularity parameter of the functions (in terms of security, the two reductions are incomparable).
In addition, we use the randomized iterate to show a construction of a pseudorandom generator based on an exponentially hard one-way function
that has seed length of only Θ(n^{2}). This improves a recent result of Holenstein [TCC '06] that shows a
construction with seed length Θ(n^{5}).
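The randomized iterate itself is easy to state: interleave applications of f with fresh pairwise-independent hash functions. A minimal sketch under toy assumptions (squaring mod a prime stands in for a one-way function, and h_i(y) = a_i*y + b_i mod p stands in for the pairwise-independent family; neither is the construction's actual instantiation):

```python
import random

P = 2**13 - 1  # toy prime modulus; the actual construction works over {0,1}^n

def f(x):
    # Stand-in for a one-way function; squaring mod P is NOT one-way, illustration only.
    return (x * x) % P

def randomized_iterate(x, hashes):
    """The k-th randomized iterate: between consecutive applications of f,
    apply a fresh pairwise-independent hash h_i(y) = a_i*y + b_i mod P."""
    y = f(x)
    for a, b in hashes:
        y = f((a * y + b) % P)
    return y

# k fresh hash descriptions (a_i, b_i); naively these dominate the seed length,
# which is why the paper derandomizes their choice using generators against
# space-bounded computations to reach seed length Θ(n log n).
k = 5
hashes = [(random.randrange(1, P), random.randrange(P)) for _ in range(k)]
print(randomized_iterate(random.randrange(P), hashes))
```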
Iftach Haitner, Danny Harnik
and Omer Reingold
On Efficient Pseudorandom Generators from Exponentially Hard One-Way Functions
[slides]
International Colloquium on Automata, Languages and Programming - ICALP, 2006 [pdf]
Full version is part of On the Power of the Randomized Iterate
We show a construction of a pseudorandom generator from any exponentially hard one-way function with a blowup of only Θ(n^{2}). This improves the recent Θ(n^{5}) construction of Holenstein [TCC '06]. Our technique uses the tools recently presented in Haitner et al. [CRYPTO '06] for the setting of regular one-way functions, and further develops them.
Iftach Haitner, Omer Horvitz, Jonathan Katz, Chiu-Yuen Koo, Ruggero Morselli and Ronen Shaltiel
Reducing Complexity Assumptions for Statistically-Hiding Commitment
[slides]
Journal of Cryptology 2009 [link]
Advances in Cryptology - Eurocrypt, 2005 [pdf]
Draft of full version [pdf]
We revisit the following question: what are the minimal
assumptions needed to construct statistically-hiding commitment schemes?
Naor et al. [J. Cryptology '98] show how to construct such schemes based on any one-way
permutation. We improve upon this by showing a construction based on any approximable preimage-size one-way function.
These are one-way functions for which it is possible to efficiently approximate the number of pre-images of a given output. A special case
is the class of regular one-way functions where all points in the image of the function have the same (known) number of
pre-images.
We also prove two additional results related to statistically-hiding commitment. First, we prove a (folklore) result
showing, roughly speaking, that the statistical hiding property of any such commitment scheme is amplified exponentially when multiple independent parallel
executions of the scheme are carried out. Second, we show a compiler which transforms any commitment scheme which is statistically hiding
against an honest-but-curious receiver into one which is statistically hiding even against a malicious receiver.
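A minimal numerical sketch of the folklore amplification phenomenon, under an illustrative model that is not the paper's formalism: treat a weakly-hiding commitment as leaking its bit through noise, XOR-share the committed bit across k parallel executions, and compute the exact statistical distance between the transcript distributions for b=0 and b=1. The distance decays exponentially in k:

```python
import itertools
from collections import defaultdict

DELTA = 0.2  # assumed leakage: a weak commitment's transcript equals b_i with prob 1/2 + DELTA

def transcript_dist(b, k):
    """Exact distribution over k-bit transcripts when committing to bit b
    via XOR secret sharing into k shares, each under the weak commitment."""
    dist = defaultdict(float)
    shares = [s for s in itertools.product([0, 1], repeat=k) if sum(s) % 2 == b]
    p_share = 1.0 / len(shares)
    for s in shares:
        for noise in itertools.product([0, 1], repeat=k):
            p = p_share
            for n in noise:
                p *= (0.5 + DELTA) if n == 0 else (0.5 - DELTA)
            c = tuple(si ^ ni for si, ni in zip(s, noise))
            dist[c] += p
    return dist

def stat_dist(d0, d1):
    """Total variation distance between two transcript distributions."""
    keys = set(d0) | set(d1)
    return 0.5 * sum(abs(d0.get(t, 0.0) - d1.get(t, 0.0)) for t in keys)

for k in (1, 2, 3):
    sd = stat_dist(transcript_dist(0, k), transcript_dist(1, k))
    print(k, round(sd, 6))  # hiding error shrinks as (2*DELTA)**k: 0.4, 0.16, 0.064
```

The exponent matches the piling-up lemma: the XOR of the per-execution noise bits is the only signal about b that survives, and its bias is the product of the individual biases.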
Iftach Haitner
Implementing Oblivious Transfer Using a Collection of Dense Trapdoor Permutations
[slides]
Theory of Cryptography Conference (TCC), 2004 [pdf]
Draft of full version [pdf]
Until recently, the existence of a collection of trapdoor permutations (TDP) was believed (and claimed) to imply almost all of the
major cryptographic primitives, including public-key encryption (PKE), oblivious transfer (OT), and non-interactive zero-knowledge (NIZK). It
was recently realized, however, that the commonly accepted general definition of TDP needs to be strengthened slightly in order to make the
security proofs of TDP-based OT go through. We present an implementation of oblivious transfer based on a collection of dense trapdoor permutations.
The latter is a collection of trapdoor permutations, with the property that the permutation domains are polynomially dense in the
set of all strings of a particular length. Previous TDP-based implementations of oblivious transfer assumed an enhancement of the hardness
assumption (of the collection).
Here we present an alternative construction that only requires the TDP to have dense domains. Specifically, we present
an implementation of OT based on any dense TDP.