The research institute has set up a website. Here it is!

Lots of interesting information on the site. You can find out about its awesome D-Wave Two computer, its goals, and the people involved. Check it out.


Geordie, the black-and-white icon for the team has got to go. It looks like a horde of zombies. I don’t think that website was vetted by an artiste like Suzanne.

What strikes me is how cyclical our progress is: we moved from analog to digital processing in all areas of computation! Now, with the paradigm of Adiabatic Quantum Computation becoming a reality, we are probably witnessing a new supercycle of analog computing!!

All the best to the D-Wave and QuAIL teams.

I don’t think P = NP. I don’t see what all the big fuss is about, as many of the important problems are just in P, and quantum computers may solve some problems before classical computers can even define them. In any event, I know why Grover’s Algorithm only speeds up, or scales, or whatever, to the square root.

Quantum computers have only two end states (not counting superposition): up or down, etc. And the cancelling-out process doesn’t cancel out three or more qubits at a time. If there were three end states, and cancelling-out across 3 qubits, the algorithm could be more powerful.
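The square-root scaling mentioned above can be checked with a small classical simulation of Grover’s iteration on real amplitudes (a minimal sketch; the function name and parameters are mine, not from the comment):

```python
import math

def grover_probability(n_items: int, marked: int, iterations: int) -> float:
    """Simulate Grover iterations on an n_items search space; return the
    probability of measuring the marked item. Real amplitudes suffice here."""
    # Start in the uniform superposition.
    amp = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amp[marked] = -amp[marked]
        # Diffusion operator: reflect every amplitude about the mean.
        mean = sum(amp) / n_items
        amp = [2.0 * mean - a for a in amp]
    return amp[marked] ** 2

n = 8
# The optimal iteration count grows like sqrt(N), hence the quadratic speedup:
best_k = math.floor(math.pi / 4 * math.sqrt(n))
print(best_k, grover_probability(n, marked=5, iterations=best_k))
```

For N = 8 this uses only 2 oracle calls to find the marked item with high probability, versus about N/2 classical queries on average.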

The NP problems can basically be viewed as being in three dimensions, the way 3D blocks can be packed in a volume, obviously. Quantum computers function only in two dimensions. But the painting-three-colours-on-a-map problem also adds at least one dimension; it least-squares one of the 2D dimensions by forcing a tiling of colours out to flat infinity when you start to colour the map. The travelling-salesman problem is also another dimension, in that it is recursive. If there is a 3D-or-more quantum computer, I don’t think we get there with this 21st-century GUT shortage or near-term materials science.

We are in a 4D universe with only a 3D perspective.

Much-hyped “quantum computing” is alive and well and has been for three decades!!!

Siegel-Rosen-Feynman-Smith-Marinov [IBM Conference on Computers and Mathematics, Stanford (1986)], working in artificial neural-network (ANN) A.I. for both Rosen’s Machine Intelligence (Atherton) and Marinov’s Exxon Enterprises/A.I. (Santa Clara), following Siegel (1980), noticed that the by-rote on-node sigmoid switching function is just plain wrong, requiring large space and time computing resources: the Boltzmann machine (BM) and simulated annealing (SA). Why? Because, as Siegel (1980) trivially discovered, the by-rote sigmoid function

1/[1 + e^(E/T)] = 1/[e^(E/T) + 1]

is Fermi-Dirac quantum statistics (FDQS), which totally dominates ANN A.I. Why is that wrong? Simply because FDQS fermion repulsion means the Pauli exclusion principle and Hund’s-rule spin pairing, effectively trapping the ANN in non-optimal local minima and necessitating the space- and time-costly BM + SA. The alternative, Siegel reasoned, was quantum-statistics transmutation from the non-optimal-local-minimum/BM + SA-encumbered FDQS 1/[e^(E/T) + 1] to Bose-Einstein quantum statistics (BEQS),

1/[e^(E/T) - 1],

with NO non-optimal local minima possible, because bosons attract (versus fermions, which repel). This was called the Bose-Einstein machine, or Bose-Einstein condensation (BEC) machine!
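Whatever one makes of the argument, the two occupation functions contrasted in this comment differ only in the sign of the 1 in the denominator, and that difference is easy to see numerically (a minimal sketch; the function names are mine):

```python
import math

def fermi_dirac(E: float, T: float) -> float:
    # 1/[e^(E/T) + 1]: bounded in (0, 1); at E = 0 it equals exactly 1/2.
    return 1.0 / (math.exp(E / T) + 1.0)

def bose_einstein(E: float, T: float) -> float:
    # 1/[e^(E/T) - 1]: diverges as E/T -> 0+ (occupation piles into the
    # lowest state); undefined at E = 0, so keep E > 0.
    return 1.0 / (math.exp(E / T) - 1.0)

# Compare the two occupations at a few energies, T = 1:
for E in (0.1, 1.0, 3.0):
    print(E, fermi_dirac(E, 1.0), bose_einstein(E, 1.0))
```

At low E/T the Bose-Einstein value blows up while the Fermi-Dirac value stays below 1; at high E/T both decay toward e^(-E/T).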

Thus much-hyped “quantum computing” via quantum statistical mechanics/physics is alive and well and has been in ANN A.I. for three decades now!!! As well, Siegel’s trivial proof that P =/= NP [reference: google "Edward Siegel P = NP"] via Menger [Dimensionstheorie, Teubner (1928)] dimension theory and Siegel FUZZYICS=CATEGORYICS=PRAGMATYICS/category-semantics cognition is decades old!!!:

P=/=NP Category-Semantics (C-S) TRIVIAL Proof: EUCLID!!! [(So Miscalled) "Computational"-"Complexity" (CC) Jargonial-Obfuscation (J-O); (Which???) MillenniumED-ProblemED (M-P): NO CC, "CS" Feet of Clay!!!]

Edward Siegel, London Clay

P=/=NP M-P proof is by C-S J-O elimination! C-S P=(?)=NP MEANS (Deterministic).(P-C)=(?)=(Non-D).(P-C)=(NP), i.e. D.(P)=(?)=N.(P). For inclusion (equality) vs. exclusion (inequality), the irrelevant (P) simply cancels! (Equally any other CC, IF both sides are identical.) The crucial question left: (D)=(?)=(N-D), i.e. D =(?)= N. Algorithmics: Deterministic (D) serial vs. Non-deterministic (N) NON-serial; a branch fork forms a triangle, its vertices a plane. Menger dimension theory: D serial is one-dimensional, dim(D) = 1 (by definition), versus N non-serial is > one-dimensional, dim(N) = 2 (branching; fork; triangle; plane) + E (probabilistic) > 2 [Sipser, Intro. to the Theory of Computation, PWS Pub. Co. (1997), p. 49, Fig. 1.15!!!]. Hence (Euclid [-300 BCE]), by simple formative geometry, dim(D) = 1 =/= dim(N) = 2 (branching) + E (probabilistic) > 2, left-to-right INclusion VERSUS right-to-left EXclusion. Hence P =/= NP!!! QED, i.e. D =/= N, i.e. dim(D) = 1 =/= dim(N) = 2 (branching) + E (probabilistic) > 2, by the first millennium BCE, before the “CS” J-O of CC!!! Harder proofs, but still amenable to FUZZYICS C-S J-O analysis, are any combinations with DIS-similar CCs, especially an LHS combining D with low CC and/or an RHS combining N with high CC!


Dr. Edward Siegel “physical-mathematicist”

categorysemantics@gmail.com

(206) 659-0235

I think you can use Grover’s Algorithm to enable nearly real-time surveillance of future WMDs from sensor-network databases such as video libraries. The user interface and the operation of computers themselves can be captured by digital video cameras and turned into zeroes and ones. Certain video images will correlate with someone doing R+D, or operating a computer, potentially to enable a WMD like AI or a chemical poison or a super-intelligent brain implant. It will take some time to establish a list of which video images (or other sensor media, such as EEG) are potentially a WMD/tyranny threat. It will be tough not to slow down beneficial research. Such a technology could enable 1984, but it can also form a safe form of surveillance, as there is no human actually looking at the cameras; we keep our privacy. Actuarying the list will be tough. The concept is to search items on a WMD list one by one through a near real-time sensor datastream or recent libraries. When a match comes up, say someone using a big computer or someone powerful going mentally ill, a police, military, and rapid-response reaction can occur quickly.