
bound you got, by simply sieving an array that many bits long. You need the RH to guarantee that any region of this size contains a prime, however. All multiples of numbers up to this bound are removed by the sieving.
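For concreteness, the sieving step described here can be sketched as a segmented sieve; the function name and the sample interval below are illustrative, not anything the commenter specified:

```python
import math

def primes_in_interval(a, b):
    """Segmented sieve: find all primes in [a, b] by removing the
    multiples of every prime p <= sqrt(b)."""
    limit = math.isqrt(b)
    # Ordinary sieve of Eratosthenes for the small primes up to sqrt(b).
    is_small_prime = bytearray([1]) * (limit + 1)
    is_small_prime[:2] = b"\x00\x00"
    for p in range(2, math.isqrt(limit) + 1):
        if is_small_prime[p]:
            for m in range(p * p, limit + 1, p):
                is_small_prime[m] = 0
    small_primes = [p for p in range(2, limit + 1) if is_small_prime[p]]

    # Sieve the segment: seg[i] represents the integer a + i.
    seg = bytearray([1]) * (b - a + 1)
    for p in small_primes:
        first = max(p * p, ((a + p - 1) // p) * p)  # first multiple of p in [a, b]
        for m in range(first, b + 1, p):
            seg[m - a] = 0
    return [a + i for i, alive in enumerate(seg) if alive and a + i >= 2]
```

The work is roughly proportional to the length of the interval times log log b, which is why one wants RH-type guarantees that a short interval already contains a prime.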

Well, you probably already thought of that. (I don’t know how to use this blog thing to tell what happened before. Where’s the how-to guide?) Sure is low-tech.

As another comment, I’m not sure this whole topic has much interest. To explain: we have easy randomized algorithms that will do the job in polynomial(k) time with high probability, and these algorithms are very, very simple. Any deterministic algorithm you invent is almost certainly going to be, by comparison, utterly worthless: far more complicated and slower. That’s the first problem. The second problem is this: suppose I derandomize that algorithm by simply plugging in certain deterministic pseudorandom number generators (I have some which I can prove, starting from a random seed, will pass every possible polytime statistical test provided that ANY generator exists that does… furthermore, I can prove they are nearly as immune to randomness tests as it is possible for any algorithm to be). In that case, either I’ve solved your problem, or no such pseudorandom number generator exists. So in the event I haven’t solved your problem, a way, way more important and profound result is at hand. And only very weak randomness properties are required in your application.

So all this suggests to me, that this was a fairly stupid problem to choose.

Mind you, I won’t say it’s totally stupid since I admit it would feel interesting to know the answer to your problem. It’s kind of like the AKS primality test was allegedly a great development… and yes it was in some sense… but from a practical point of view, it is a total waste of time.

My final comment is: are you aware of the following papers?

János Pintz, William L. Steiger, Endre Szemerédi: Two Infinite Sets of Primes with Fast Primality Tests, STOC 1988, pp. 504-509; journal version in Mathematics of Computation, Vol. 53, No. 187, July 1989, pp. 399-406.

Eric Bach: How to Generate Factored Random Numbers, SIAM J. Computing 17 (1988), pp. 179-193.

These were done pre-AKS, but are better than AKS for many purposes of the kind you have in mind.

http://qwiki.stanford.edu/wiki/Complexity_Zoo:F

I think there should also be an FBPP, but I can’t find it.

I think the problem is in FBQP. We are given the interval from 10^k to 10^(k+1). With a quantum computer we have a source of random numbers, so we can guess candidates at random, and each guess-and-test is polynomial in k with success probability about 1/k; if we repeat this test 2k times, we should have success probability about 1 - e^-2, and the problem should be in FBQP. This looks better than any search involving z^.49, because I think z is exponential in k. If FBPP were defined analogously to FBQP, a similar argument should show the problem is in FBPP.
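Either way, the guess-and-test loop itself is easy to sketch classically; the Miller–Rabin test and the trial budget below are my choices for illustration, not anything specified in the comment:

```python
import random

def is_probable_prime(n, rounds=20):
    """Miller–Rabin test: a composite n survives with probability <= 4**(-rounds)."""
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = x * x % n
            if x == n - 1:
                break
        else:
            return False  # a witnesses that n is composite
    return True

def random_prime(k):
    """Guess-and-test in [10^k, 10^(k+1)): the density of primes there is
    about 1/(k ln 10), so O(k) random draws suffice with high probability."""
    lo, hi = 10 ** k, 10 ** (k + 1)
    for _ in range(50 * (k + 1)):  # generous O(k) trial budget
        n = random.randrange(lo, hi)
        if is_probable_prime(n):
            return n
    return None  # overwhelmingly unlikely
```

Each iteration costs polynomial(k) arithmetic, which is the "very simple randomized algorithm" the earlier comment alludes to.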

F(x) = f_1(x) g_1(x) + f_2(x) g_2(x) + ... + f_r(x) g_r(x),

where F is our generating function for the primes in the interval, and where the f_i and g_i are polynomials such that the total number of terms among all of them is small. What we would like to do is quickly evaluate F at, say,

x = e^(2 pi i j / N), for j = 0, 1, ..., N - 1, (*)

where N is a suitably large parameter; and, we’d like to be able to do this using a lot fewer operations than trivially expanding out each product and evaluating term by term.

Now let m be some integer (that we choose later), and write each of the f_i as

f_i(x) = f_{i,0}(x) + x^m f_{i,1}(x),

where

f_{i,0}(x) = sum of c_{i,j} x^j over j < m

and

f_{i,1}(x) = sum of c_{i,j} x^(j-m) over j >= m,

where the c_{i,j} are the coefficients of f_i; and, write g_i(x) = g_{i,0}(x) + x^m g_{i,1}(x) similarly.

Ideally, we want m to be such that each of f_{i,0} and f_{i,1} has about half as many terms as f_i, and we want the analogous property to hold for the g_{i,0} and g_{i,1}; of course, in order to guarantee that this is possible, we would have to select the decomposition for F above carefully.

Now, assuming that we can find such an m (and have such a decomposition for F), we then observe the following, which is basically Karatsuba’s identity:

F(x) = F_0(x) + x^m F_1(x) + x^(2m) F_2(x),

where

F_0(x) = sum over i of f_{i,0}(x) g_{i,0}(x),

F_1(x) = sum over i of [(f_{i,0}(x) + f_{i,1}(x))(g_{i,0}(x) + g_{i,1}(x)) - f_{i,0}(x) g_{i,0}(x) - f_{i,1}(x) g_{i,1}(x)],

F_2(x) = sum over i of f_{i,1}(x) g_{i,1}(x).

The point here is that we have replaced one sum of products of two polynomials (the sum of the f_i g_i) with three sums of such products, but where each polynomial has degree about half what we had before: each of these sums (forgetting the factors x^m, x^(2m) and so on) involves products of two polynomials of degree at most about m. The idea is then to iterate the above process, starting with *these* sums, and then replacing each by three sums of products of polynomials each of degree about m/2, and so on. Eventually, we get down to sums of products of polynomials that we can just expand out by trivial methods, as they will have few terms to begin with, and then apply FFTs to evaluate them at the points (*) above.

Unfortunately, it is not always the case that the new polynomials produced at each iteration have fewer terms, since the f_{i,0} and f_{i,1} may have been sparse to begin with (in fact, likely are). But what we can hope for is that the initial polynomials f_i and g_i can be chosen carefully so that after a small number of iterations we get “mixing”: that is, the number of terms in, say, each of the polynomials F_0, F_1, F_2 is not much bigger than the number of terms in each of the sums of products they replaced.

In other words, we will want that, sufficiently many iterations into the above process, the pieces f_{i,0} and f_{i,1} share many terms in common with one another, and the same for the g’s, so that the combined sums collapse.
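As a sanity check, Karatsuba’s identity for a single product can be verified with sparse polynomials stored as exponent-to-coefficient dicts; this toy code, including the split point m = 4, is purely illustrative:

```python
def poly_mul(p, q):
    """Multiply sparse polynomials stored as {exponent: coefficient} dicts."""
    out = {}
    for e1, c1 in p.items():
        for e2, c2 in q.items():
            out[e1 + e2] = out.get(e1 + e2, 0) + c1 * c2
    return {e: c for e, c in out.items() if c}

def poly_sum(*polys):
    out = {}
    for poly in polys:
        for e, c in poly.items():
            out[e] = out.get(e, 0) + c
    return {e: c for e, c in out.items() if c}

def poly_neg(p):
    return {e: -c for e, c in p.items()}

def shift(p, m):
    """Multiply by x^m."""
    return {e + m: c for e, c in p.items()}

def split(p, m):
    """Write p(x) = p0(x) + x^m * p1(x)."""
    p0 = {e: c for e, c in p.items() if e < m}
    p1 = {e - m: c for e, c in p.items() if e >= m}
    return p0, p1

def karatsuba_product(f, g, m):
    """Rebuild f*g from the three half-degree products in Karatsuba's identity."""
    f0, f1 = split(f, m)
    g0, g1 = split(g, m)
    low = poly_mul(f0, g0)                       # the F_0-type piece
    high = poly_mul(f1, g1)                      # the F_2-type piece
    mid = poly_sum(poly_mul(poly_sum(f0, f1), poly_sum(g0, g1)),
                   poly_neg(low), poly_neg(high))  # the F_1-type piece
    return poly_sum(low, shift(mid, m), shift(high, 2 * m))

f = {0: 1, 3: 2, 7: -1}   # 1 + 2x^3 - x^7
g = {1: 4, 6: 5}          # 4x + 5x^6
assert karatsuba_product(f, g, 4) == poly_mul(f, g)
```

Note that in this sparse representation the three pieces need not have fewer terms than f and g did, which is exactly the “mixing” obstruction discussed above.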

A relevant decision problem, though, is “Does there exist a prime in the interval [a,b]?”. If one can solve this decision problem quickly, one can solve the search problem quickly, by a binary search starting from [x,2x].
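That reduction can be sketched as follows; the decision oracle below is a brute-force stand-in, since the argument only needs it as a black box, and the point is that the search makes only O(log x) oracle calls:

```python
def has_prime(a, b):
    """Decision oracle: is there a prime in [a, b]?  Brute force here;
    the reduction treats this as a black box."""
    def is_prime(n):
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True
    return any(is_prime(n) for n in range(a, b + 1))

def find_prime(x):
    """Find a prime in [x, 2x] with O(log x) oracle calls: repeatedly halve
    the interval, always descending into a half that contains a prime.
    (Bertrand's postulate guarantees [x, 2x] contains one for x >= 1.)"""
    a, b = x, 2 * x
    while a < b:
        mid = (a + b) // 2
        if has_prime(a, mid):
            b = mid
        else:
            a = mid + 1  # any prime must lie in the upper half
    return a
```

Since each step shrinks [a, b] by half, a fast decision procedure immediately yields a fast search procedure.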

It is still open whether this decision problem is in BPP. The problem is that the density of primes in [a,b] could be very low, and so it is not clear, even after polylogarithmically many samples, that one has a 2/3 chance or more of finding a prime (the needle-in-a-haystack problem). But it seems from Ernie’s work that this problem is at least in a “BP-” variant with a larger work bound, by which I mean that there is a probabilistic algorithm which, after that larger amount of work, will correctly determine whether or not there is a prime in [a,b], with a failure probability of at most 1/3 in either case.

In fact, from

http://en.wikipedia.org/wiki/BQP

BQP contains BPP.

Furthermore, since we have a source of randomness in BPP, and the density of primes near b is about 1/log b, using the random source to draw a random candidate gives a number that is prime with probability about 1/log b. Repeating this 2 log b times gives success with probability 1 - (1 - 1/log b)^(2 log b), or roughly 1 - e^-2, which means the probability of failure is less than 1/3. I may be missing something, but it looks like the problem is in both BQP and BPP.
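The arithmetic in that failure bound is easy to check numerically; the sample values of b below are my own:

```python
import math

# Failure probability after 2*log(b) draws, each prime with chance 1/log(b):
# (1 - 1/log b)^(2 log b) is at most e^-2, comfortably below 1/3.
for b in (10 ** 9, 10 ** 100, 10 ** 1000):
    L = math.log(b)
    fail = (1 - 1 / L) ** (2 * L)
    assert fail < 1 / 3
    print(f"log b = {L:.0f}: failure probability {fail:.4f}")
```

(This bounds the chance of *missing* a prime that is there at typical density; as the next comment notes, it does not by itself decide the case where the interval is unusually poor in primes.)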

Actually, now that I think about it, we won’t get BPP, but rather only the weaker “BP-” bound, which isn’t nearly as impressive, but still non-trivial at least…
