All Submission Categories

1208 Submissions

[218] viXra:1208.0245 [pdf] replaced on 2013-11-16 10:16:33

The Arithmetic of Binary Representations of Even Positive Integer 2n and Its Application to the Solution of the Goldbach's Binary Problem

Authors: Alexander Fedorov
Comments: 50 Pages.

One of the reasons why Goldbach's binary problem remained unsolved for so long is that binary representations of an even integer 2n (BR2n) in the form of a sum of two odd primes (VSTOP) are considered separately from the other BR2n. The purpose of this work is to investigate the connections between the different types of BR2n. To realize this purpose the author developed the "Arithmetic of binary representations of an even positive integer 2n" (ABR2n). In ABR2n four types of BR2n are defined. As shown in ABR2n, all types of BR2n are connected with each other by relations that describe the distribution of the prime and composite positive integers less than 2n among them. On the basis of these relations (the axioms of ABR2n), formulas are derived for computing the number of BR2n (NBR2n) of each type. In ABR2n the average number of binary sums formed from odd prime and composite positive integers $< 2n$ (AVNBS) is also defined and computed, separately for primes and for composite positive integers. Formulas are also derived for the deviation of NBR2n from AVNBS. It is shown that as $n$ goes to infinity, NBR2n approaches AVNBS, which permits the formulas for AVNBS to be applied to the computation of NBR2n. At the end, a proof of Goldbach's binary problem is produced with the help of ABR2n. It uses proof by contradiction: we assume that for some 2n no BR2n in the VSTOP exists, carry out the computations under this assumption, and arrive at a contradiction. Hence the assumption is false, and for all $2n > 2$ a BR2n in the VSTOP exists.
Category: Number Theory
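
As a rough illustration of the counting described above (a toy sketch, not the author's ABR2n formalism or its axioms), the following Python snippet classifies the representations 2n = a + b with odd summands 3 <= a <= b by the primality of a and b, which yields four natural types of BR2n; the prime+prime count is the number of VSTOP representations.

```python
# Toy sketch (not the author's ABR2n): classify 2n = a + b, odd 3 <= a <= b,
# by the primality of the two summands.
def is_prime(k):
    if k < 2:
        return False
    if k % 2 == 0:
        return k == 2
    d = 3
    while d * d <= k:
        if k % d == 0:
            return False
        d += 2
    return True

def br2n_type_counts(two_n):
    """Count the four types of binary representations of an even 2n >= 6."""
    counts = {"prime+prime": 0, "prime+composite": 0,
              "composite+prime": 0, "composite+composite": 0}
    for a in range(3, two_n // 2 + 1, 2):
        b = two_n - a                      # b is odd and b >= a
        key = ("prime" if is_prime(a) else "composite") + "+" + \
              ("prime" if is_prime(b) else "composite")
        counts[key] += 1
    return counts

print(br2n_type_counts(100))   # prime+prime count is 6 for 2n = 100
```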

[217] viXra:1208.0244 [pdf] submitted on 2012-08-31 13:12:22

A New Conjecture Towards the Proof of the Hodge Conjecture

Authors: Jaivir S. Baweja
Comments: 3 Pages.

In this paper, we review important facts related to the Hodge conjecture. We also review Chow classes and their importance to the problem. At the end of this survey, we pose a new conjecture that, if proven true, would advance work on this important Millennium Prize problem.
Category: General Mathematics

[216] viXra:1208.0243 [pdf] replaced on 2012-09-08 16:11:53

The “HIGGS” from Anti-Neutrons and Neutrons

Authors: Glenn A. Baxter
Comments: Eleven pages

The new anti-neutron, first proposed in 2011 [16], and ordinary neutrons, are both without electric charge, and are therefore rather difficult to accelerate and/or impact together in a high energy collider sufficient to cause annihilation, and thus create a 100% energy “Higgs” styled Boson, such Bosons arguably representing all of the mass in the universe. This very high energy impact combination was first suggested during conversation, publicly, by U.K. chemist R. Guy Grantham, MRSC, on 25 August 2011. [18] This writer had proposed, previously [16], a simple universal theory/model of the atom composed of anti-neutrons, electrons, positrons, and neutrinos, which better explains fusion, fission, radioactivity, electromagnetic radiation, gravity, electric force, magnetic force, and the strong force. Dr. D. Sasso’s recent formalisms for electrons, positrons, and photons [19] are adopted herein to better describe the author’s original 2011 anti-neutron model of the atom. [16]
Category: Nuclear and Atomic Physics

[215] viXra:1208.0242 [pdf] replaced on 2013-08-18 16:01:48

Constructive Motives and Scattering

Authors: M. D. Sheppeard
Comments: 165 Pages. WARNING: written under extreme stress - ignore prior drafts

This elementary text is for anyone interested in combinatorial methods in modern axiomatic particle physics. It focuses on the role of knots in motivic arithmetic, and the connection to category theoretic structures. Phenomenological aspects of rest mass quantum numbers are also discussed.
Category: Quantum Gravity and String Theory

[214] viXra:1208.0241 [pdf] submitted on 2012-08-31 09:25:56

New Insight Into Classical Electrodynamics

Authors: A.N. Grigor'ev
Comments: 8 Pages.

It is demonstrated that the magnetic field surrounding a chain of uniformly moving charges does not constitute a sum of the magnetic fields of the single charges but is a result of interaction between a test charge and all the chain charges. A single moving charge is not surrounded by any magnetic field. Electromagnetic radiation and the inductive effect are considered under the assumption that there is no magnetic field at all. It is pointed out that classical electrodynamics can be based on the formula for the electric field of an arbitrarily moving charge.
Category: Relativity and Cosmology

[213] viXra:1208.0239 [pdf] replaced on 2014-10-12 15:47:23

Gravitational Blueshift and Redshift Generated at Laboratory Scale

Authors: Fran De Aquino
Comments: 8 Pages.

In this paper we show that it is possible to produce gravitational blueshift and redshift at laboratory scale by means of a device that can strongly intensify the local gravitational potential. Thus, by using this device, it is possible to generate electromagnetic radiation of any frequency, from ELF radiation (f < 10Hz) up to high energy gamma-rays. In this case, several uses, such as medical imaging, radiotherapy and radioisotope production for PET (positron emission tomography) scanning, could be realized. The device is smaller and less costly than conventional sources of gamma rays.
Category: Relativity and Cosmology

[212] viXra:1208.0237 [pdf] submitted on 2012-08-30 04:32:08

Can Differentiable Description of Physical Reality be Considered Complete? Toward a Complete Theory of Relativity

Authors: Xiong Wang
Comments: 15 Pages.

How to relate physical \emph{real} reality with logically \emph{true} abstract mathematical concepts is nothing but pure postulate. The most basic postulates of physics concern what kind of mathematics is used to describe the most fundamental concepts of physics. The main point of relativity theories is to remove incorrect assumptions, and to simplify the remaining assumptions, about the nature of space-time. There are plentiful bonuses for doing so; for example, gravity emerges as a natural consequence of the curvature of spacetime. We argue that the Einstein version of general relativity is not complete, since it cannot explain quantum phenomena. If we want to reconcile it with the quantum, we should give up one implicit assumption we tend to forget: differentiability. What would be the benefits of this change? It has many surprising consequences. We show that the weird uncertainty principle and non-commutativity become straightforward in the circumstances of non-differentiable functions; they are just the result of the divergence of the usual definition of \emph{velocity}. All the weirdness of quantum mechanics is due to our trying to make sense of nonsense. Finally, we propose a complete relativity theory in which spacetime is a non-differentiable manifold, and physical law takes the same mathematical form in all coordinate systems, under arbitrary differentiable or non-differentiable coordinate transformations. Quantum phenomena emerge as a natural consequence of the non-differentiability of spacetime.
Category: Mathematical Physics

[211] viXra:1208.0236 [pdf] submitted on 2012-08-30 08:52:23

Hierarchy of Theories of Unified Gravity and Dynamics at the Neighborhood of Several Gravitational Field Sources. Part II.

Authors: Akindele O. Adekugbe Joseph
Comments: 29 Pages. An article in volume one of THE FUNDAMENTAL THEORY... monograph series on an evolving fundamental theory of physics on a many-world background, starting from a complete theory of relativity and gravitation in four-world.

Corresponding to the special theory of relativity/intrinsic special theory of relativity (SR/φSR) and the theory of gravitational relativity/intrinsic theory of gravitational relativity (TGR/φTGR) on flat four-dimensional spacetime/flat two-dimensional intrinsic spacetime, and the metric theory of absolute intrinsic motion (φMAM) and the metric theory of absolute intrinsic gravity (φMAG) on curved ‘two-dimensional’ absolute intrinsic spacetime, at the second stage of evolutions of spacetime/intrinsic spacetime and parameters/intrinsic parameters in every gravitational field, there are unified SR/φSR and TGR/φTGR on flat four-dimensional spacetime/flat two-dimensional intrinsic spacetime, denoted by SR/φSR+TGR/φTGR, and unified φMAM and φMAG on curved ‘two-dimensional’ absolute intrinsic spacetime, denoted by φMAM+φMAG. These unified theories are accomplished in this article for two cases: (i) a test particle in motion at a large velocity relative to an observer at the neighborhood of one, two or several gravitational field sources, and (ii) a gravitational field source, such as a massive star or a neutron star, in motion at a large velocity relative to an observer in a region of space that is devoid of the gravitational field of any other source. These essentially entail the incorporation of the velocity v of relative motion into the results of TGR, and of the absolute intrinsic dynamical speed φVd of absolute intrinsic motion into the absolute intrinsic metric tensor φgik of φMAG, developed in the previous articles for the two situations. It is shown that the existing special theory of relativity, referred to as Lorentz-Einstein-Minkowski special relativity (LEM) in this article, is valid strictly for the relative motion of the electron or its anti-particle.
Category: Relativity and Cosmology

[210] viXra:1208.0235 [pdf] submitted on 2012-08-30 09:31:38

The Navigation Mobile Robot Systems Using Bayesian Approach Through the Virtual Projection Method

Authors: Luige Vladareanu, Gabriela Tont, Victor Vladareanu, Florentin Smarandache, Lucian Capitanu
Comments: 6 Pages.

The paper presents the navigation of mobile walking robot systems for movement in non-stationary and non-structured environments, using a Bayesian approach to Simultaneous Localization and Mapping (SLAM) for avoiding obstacles and dynamical stability control for motion on rough terrain. By processing inertial information on force, torque and tilting, and wireless sensor networks (WSN), an intelligent high-level algorithm is implemented using the virtual projection method. The control system architecture for dynamic robot walking is presented in correlation with a stochastic model for assessing the system probability of unidirectional or bidirectional transition states, applying non-homogeneous/non-stationary Markov chains. The rationality and validity of the proposed model are demonstrated via an example of quantitative assessment of the state probabilities of an autonomous robot. The results show that the proposed new navigation strategy for the mobile robot, using a Bayesian approach to walking robot control systems for going around obstacles, has increased the robot's mobility and stability in the workspace.
Category: Artificial Intelligence
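
As a minimal, hedged sketch of the state-probability bookkeeping mentioned in the abstract (with made-up states and numbers, not the authors' robot model), a non-homogeneous Markov chain simply propagates the state distribution through a different transition matrix at each step:

```python
import numpy as np

def propagate(p0, transition_matrices):
    """Propagate an initial state distribution through a non-homogeneous
    Markov chain, one row-stochastic transition matrix per time step."""
    p = np.asarray(p0, dtype=float)
    history = [p]
    for P in transition_matrices:
        p = p @ np.asarray(P, dtype=float)    # p_{k+1} = p_k P_k
        history.append(p)
    return history

# Hypothetical 3-state walking robot: 'walking', 'avoiding obstacle', 'halted'.
P_smooth = np.array([[0.90, 0.08, 0.02],
                     [0.30, 0.60, 0.10],
                     [0.00, 0.20, 0.80]])
P_rough  = np.array([[0.70, 0.25, 0.05],
                     [0.20, 0.65, 0.15],
                     [0.00, 0.10, 0.90]])
# Terrain changes over time, so the chain is non-stationary:
schedule = [P_smooth, P_smooth, P_rough, P_rough, P_rough]
print(propagate([1.0, 0.0, 0.0], schedule)[-1])   # state probabilities after 5 steps
```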

[209] viXra:1208.0233 [pdf] replaced on 2012-08-29 15:26:47

The Second Project for the 21st Century Student: Let us Try to Use Schrödinger's Hypothesis About the Role of Neg-Entropy for Life to Help in Solving the Problem of Hooliganism

Authors: Emanuel Gluskin
Comments: The formal target of the project is simple, but the statistical data to be obtained should be very important. The work is at the junction of biology, sociology and system theory. 10 pages, 2 Figures.

This is the second (after [1]) suggested topic for a project for a student who wishes to look ahead across the wide thematic scope (here, biological measurement) of modern electronics. The idea is motivated by the brochure [2] by Erwin Schrödinger, connecting life with negative "entropy". Not having as purely psychological a background as [1], but extending the idea of [2], the present work touches on the psychology of personality in the sense of a collective societal influence. The point is to fight against hooliganism in its modern organized version, expressed in the days of violence, via a correct understanding of the causes of this very serious social phenomenon and, consequently, to develop a just attitude toward the defined "hooligans". The project's target is to check the state of brain activity of different humans when they are receiving "typical" information from radio or TV, and to develop relevant recommendations for news editors.
Category: Mind Science

[208] viXra:1208.0232 [pdf] submitted on 2012-08-29 01:32:26

New Developments: The Big Bang - in Controversy

Authors: Roger A. Rydin, Stephen J. Crothers
Comments: 12 Pages. This paper was written on 8 October, 2009.

Recall that Einstein's fiendishly complex equations of gravity can be solved exactly only if we assume that the Universe on the large scale is homogeneous - that is, it looks the same from every place. This assumption, enshrined in the Cosmological Principle, leads to the Friedmann-Robertson-Walker solutions: the Big Bang models. Abandon that assumption and everything we thought we knew about the Universe gets jettisoned, as New Scientist has pointed out (21 August 1999, p 22).
Category: Relativity and Cosmology

[207] viXra:1208.0231 [pdf] submitted on 2012-08-29 05:54:02

A Presently Overlooked Explanation for the Constant Speed of Light, Relativistic Effects and Magnetic Force

Authors: Steffen Kühn
Comments: 18 Pages.

This study discusses a new, unusual but simple explanation for the constant speed of light, some relativistic effects and the magnetic force. The starting point of the discussion is a single, plausible postulate which links the principle of relativity with a constant speed of light for all inertial observers, without assumptions about space and time. The postulate makes it possible to further explain the effect of time dilation, the relativistic Doppler effect and the Lorentz force. This study also shows that the concept leads almost, but not exactly, to the special theory of relativity. It is surprising that this approach has not been discussed to date: despite its high interpretive power, the hypothesis remains completely unexplored.
Category: Relativity and Cosmology

[206] viXra:1208.0229 [pdf] submitted on 2012-08-29 00:56:32

A Short Discussion of Relativistic Geometry

Authors: Stephen J. Crothers
Comments: 7 Pages. Published: Bulletin of Pure and Applied Sciences.Vol.24E(No.2)2005:P.267-273

The relativists have not understood the geometry of Einstein’s gravitational field. They have failed to realise that the geometrical structure of spacetime manifests in the geometrical relations between the components of the metric tensor. Consequently, they have foisted upon spacetime quantities and geometrical relations which do not belong to it, producing thereby, grotesque objects, not due to Nature, but instead, to faulty thinking. The correct geometry and its consequences are described herein.
Category: Relativity and Cosmology

[205] viXra:1208.0228 [pdf] submitted on 2012-08-29 01:00:24

On the Alleged ‘Supermassive Black Hole’ in the Bright Quasar 3C 279 (An Open Letter to Weintroub et al.)

Authors: Stephen J. Crothers
Comments: 3 Pages.

On the 18th July 2012 Weintroub et al. reported the alleged discovery of a supermassive black hole in the bright Quasar 3C 279 in the following online article ‘APEX takes part in sharpest observation ever’ 18-Jul-2012, (http://www.eurekalert.org/pub_releases/2012-07/e-atp071612.php). On the 19th July 2012 this Open Letter was forwarded to the scientists who jointly reported this. It explains, in very simple physical terms, why it is impossible for there to be a black hole in Quasar 3C 279.
Category: Relativity and Cosmology

[204] viXra:1208.0227 [pdf] submitted on 2012-08-28 07:10:24

Hierarchy of Theories of Unified Gravity and Dynamics at the Neighborhood of Several Gravitational Field Sources. Part I

Authors: Akindele O. Adekugbe Joseph
Comments: 42 Pages. An article in volume one of THE FUNDAMENTAL THEORY...monograph series on an evolving fundamental theory of physics with a complete theory of relativity and gravitation in four-world as its foundation.

The two-theory approach to gravitation at the second stage of evolutions of spacetime/intrinsic spacetime and parameters/intrinsic parameters in a gravitational field of arbitrary strength, comprising the theory of gravitational relativity/intrinsic theory of gravitational relativity (TGR/øTGR) on flat spacetime/flat intrinsic spacetime and the metric theory of absolute intrinsic gravity (øMAG) on curved absolute intrinsic spacetime, isolated at the neighborhood of one gravitational field source in the earlier articles, is advanced to the situations where two, three and several gravitational field sources are scattered in the Euclidean 3-space about a location where the theories are formulated. Gravitational time dilation, gravitational length contraction and the assemblage of parameter transformations in the context of TGR are extended to the neighborhood of several gravitational field sources. The extension of TGR to the situation where a number N of gravitational field sources are interacting (the N-body problem) is accomplished for N=2 and N=3 and shown to admit of a straightforward extension to larger values of N, except that it becomes increasingly cumbersome as N increases beyond 4. On the other hand, øMAG admits of an easy and straightforward extension to the N-body problem for any value of N. Einstein's principle of equivalence is validated in the context of TGR at the neighborhood of any number of gravitational field sources, from which its universal validity follows.
Category: Relativity and Cosmology

[203] viXra:1208.0226 [pdf] replaced on 2012-12-04 09:25:42

Complex Noise-Bits and Large-Scale Instantaneous Parallel Operations with Low Complexity

Authors: He Wen, Laszlo B. Kish, Andreas Klappenecker
Comments: 10 Pages. In press at Fluctuation and Noise Letters

We introduce the complex noise-bit as an information carrier, which requires noise signals in two parallel wires instead of the single-wire representations of noise-based logic discussed so far. The immediate advantage of this new scheme is that, when we use random telegraph waves as the noise carrier, the superposition of the first 2^N integer numbers (obtained by the Achilles heel operation) yields non-zero values. We introduce basic instantaneous operations, with O(2^0) time and hardware complexity, including bit-value measurements in product states and single-bit and two-bit noise gates (universality exists) that can instantaneously operate over large superpositions with full parallelism. We envision the possibility of implementing instantaneously running quantum algorithms on classical computers, using a number of classical bits similar to the number of quantum bits emulated, without the necessity of error correction. Mathematical analysis and proofs are given.
Category: Data Structures and Algorithms
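
For readers unfamiliar with the carrier signals named in the abstract, here is a toy Python sketch of random telegraph waves (plain single-wire RTWs only, not the authors' two-wire complex noise-bit scheme): zero-mean ±1 signals that flip at random clock steps, whose near-orthogonality under time averaging is what noise-based logic exploits.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_telegraph_wave(n_samples, flip_prob=0.5):
    """A +/-1 signal that flips sign at each clock step with probability flip_prob."""
    flips = rng.random(n_samples) < flip_prob
    start = 1 if rng.random() < 0.5 else -1          # random initial sign
    return start * np.where(np.cumsum(flips) % 2 == 0, 1, -1)

N = 100_000
w1 = random_telegraph_wave(N)
w2 = random_telegraph_wave(N)

# Independent RTWs are nearly orthogonal under time averaging,
# while each wave correlates perfectly with itself:
print(np.mean(w1 * w2))   # close to 0
print(np.mean(w1 * w1))   # exactly 1.0
```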

[202] viXra:1208.0223 [pdf] submitted on 2012-08-26 22:40:04

De Bruijn's Combinatorics

Authors: J.W.Nienhuys (Ling-Ju Hung, Ton Kloks eds.)
Comments: 192 Pages.

This is a translation of the handwritten classroom notes taken by Nienhuys of a course in combinatorics given by N.G. de Bruijn at Eindhoven University of Technology, during the 1970s and 1980s.
Category: Combinatorics and Graph Theory

[201] viXra:1208.0222 [pdf] submitted on 2012-08-27 08:11:04

Proof of the SYZ Conjecture

Authors: Jaivir S.Baweja
Comments: 5 Pages.

In this short paper, we prove that the Strominger-Yau-Zaslow (SYZ) conjecture holds by showing that mirror symmetry is equivalent to T-duality under fibrations from Lagrangian tori. In order to do this, we use some recent developments on Ooguri-Vafa spaces to construct such fibers. Moreover, we show that this is only possible under the trivial vector bundle {0}, thus giving an equivalence between the triangulated categories D^b Fuk_0(Y,ω) and D_0^b(Y̌).
Category: General Mathematics

[200] viXra:1208.0221 [pdf] replaced on 2014-10-12 15:45:20

Artificial Gravitational Lenses

Authors: Fran De Aquino
Comments: 9 Pages.

We show that it is possible to produce gravitational lenses at laboratory scale by means of a toroidal device which strongly intensifies the radial gravitational acceleration at its nucleus, and can make the acceleration repulsive as well as attractive. This means that a light flux through the toroid can become convergent toward or divergent from its central axis. These lenses are similar to optical lenses and can be very useful for telescopes, microscopes, and for the concentration of solar light in order to convert solar energy into thermal energy.
Category: Relativity and Cosmology

[199] viXra:1208.0220 [pdf] submitted on 2012-08-26 02:41:14

Clifford Space Gravitational Field Equations and Dark Energy

Authors: Carlos Castro
Comments: 14 Pages. Submitted to J. Phys. A: Mathematical and Theoretical

We continue with the study of Clifford-space Gravity and analyze further the Clifford space ($C$-space) generalized gravitational field equations which are obtained from a variational principle based on the generalization of the Einstein-Hilbert-Cartan action. One of the main features is that the $C$-space connection requires $torsion$ in order to have consistency with the Clifford algebraic structure associated with the curved $C$-space basis generators. Hence no spin matter is required to induce torsion since it already exists in the vacuum. The field equations in $C$-spaces associated to a Clifford algebra in $D$-dimensions are $not$ equivalent to the ordinary gravitational equations with torsion in higher $2^D$-dimensions. The most physically relevant conclusion, besides the presence of torsion in the vacuum, is the contribution of the $higher$ grade metric components $g^{\mu_1 \mu_2 ~ \nu_1 \nu_2}, g^{\mu_1 \mu_2 \mu_3 ~ \nu_1 \nu_2 \nu_3}, \ldots$ of the $C$-space metric to dark energy/dark matter.
Category: Quantum Gravity and String Theory

[198] viXra:1208.0219 [pdf] replaced on 2013-05-15 00:39:04

Proof of Quark Confinement and Baryon-Antibaryon Duality: I: Gauge Symmetry Breaking in Dual 4D Fractional Quantum Hall Superfluidic Space-Time

Authors: Andrej E. Inopin, Nathan O. Schmidt
Comments: 16 pages, 10 figures, published in the Hadronic Journal

We prove quark (and antiquark) confinement for a baryon-antibaryon pair and design a well-defined, easy-to-visualize, and simplified mathematical framework for particle and astrophysics based on experimental data. From scratch, we assemble a dual 4D space-time topology and generalized coordinate system for the Schwarzschild metric. Space-time is equipped with "fractional quantum number order parameter fields" and topological defects for the simultaneous and spontaneous breaking of several symmetries, which are used to construct the baryon wavefunction and its corresponding antisymmetric tensor. The confined baryon-antibaryon pair is directly connected to skyrmions with "massive 'Higgs-like' scalar amplitude-excitations" and "massless Nambu-Goldstone pseudo-scalar phase-excitations". Newton's second law and Einstein's relativity are combined to define a Lagrangian with an effective potential and an effective kinetic term. We prove that our theory upgrades the prediction precision and accuracy of QCD/QED and general relativity, implements 4D versions of string theory and Witten's M-theory, and exemplifies M.C. Escher's duality.
Category: High Energy Particle Physics

[197] viXra:1208.0217 [pdf] replaced on 2013-01-06 01:50:48

De Combinatoriek Van De Bruijn

Authors: Ton Kloks
Comments: 18 Pages. in Dutch

In memoriam N.G. de Bruijn. In this article I present some highlights of De Bruijn's contributions to combinatorics. This article does not survey his work on, e.g., Penrose tilings, asymptotics or AUTOMATH; other surveys on these topics are being written by others.
Category: Combinatorics and Graph Theory

[196] viXra:1208.0216 [pdf] submitted on 2012-08-25 00:24:27

Analytical Study on Manual vs. Automated Testing Using a Simplistic Cost Model

Authors: V. N. Maurya, Rajender Kumar
Comments: 13 Pages.

This research paper focuses on the importance of automated software testing and the software testing techniques associated with it in software engineering. We consider, categorize and shed light on software testing in the current scenario of test automation. Addressing this problem leads to the approach of software development known as software testing in the Information Technology world. Software test automation is the process of automating the steps of manual test cases using an automated tool or utility, in order to shorten the testing life cycle with respect to time. Regression testing is commonly used to test the system efficiently by systematically selecting the appropriate minimum suite of tests needed to adequately cover the affected change. Common methods of regression testing include rerunning previously run tests and checking whether previously fixed faults have re-emerged.
Category: Digital Signal Processing
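
The paper's own cost model is not reproduced in this abstract, so the following Python sketch is only a generic illustration of how such manual-versus-automated comparisons are usually framed (all functions and figures here are hypothetical): automation pays a one-off scripting cost plus a small per-run cost, manual testing pays a larger per-run cost, and the break-even point is the number of regression runs at which the two totals cross.

```python
def total_cost(runs, setup_cost, cost_per_run):
    """Cumulative testing cost after a given number of regression runs."""
    return setup_cost + runs * cost_per_run

def break_even_runs(manual_per_run, auto_setup, auto_per_run):
    """Smallest number of runs at which automation is no more expensive than
    manual testing; returns None if automation never pays off."""
    if manual_per_run <= auto_per_run:
        return None
    runs = 0
    while total_cost(runs, auto_setup, auto_per_run) > total_cost(runs, 0.0, manual_per_run):
        runs += 1
    return runs

# Hypothetical effort figures in person-hours: 4 h per manual regression cycle,
# 40 h to script the suite, 0.5 h per automated cycle.
print(break_even_runs(4.0, 40.0, 0.5))   # -> 12 runs to break even
```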

[195] viXra:1208.0214 [pdf] submitted on 2012-08-23 11:05:52

The Twin Transformations of the Light Wave's Frequency and Wavelength in the 2-Dimension Inertial System

Authors: sangwha Yi
Comments: 5 Pages.

In the special theory of relativity, in the 2-dimensional inertial coordinate system, we study the twin transformations of the light wave's frequency and wavelength.
Category: Relativity and Cosmology

[194] viXra:1208.0213 [pdf] replaced on 2013-05-18 00:35:33

Rethinking the Formal Methodology (I): Wave-Vortex Essence of the Substance

Authors: George Kirakosyan
Comments: 42 Pages. 9 Figures. Accepted for publication in HADRONIC JOURNAL

An approach/methodology to basic problems is proposed as an alternative to the standard formalism. The possibility of representing quantum phenomena within the causality principle, on the basis of the common wave-field nature of substance, is shown. The physical essence of elementary particles and the types of interactions are interpreted through wave-field peculiarities. The problems with the de Broglie wave and with particles' double-slit interference are discussed. Physical models of the basic hadrons, their internal structure and the configurations of their static fields are proposed. The values of the mass, spin and magnetic moments of the n and p hadrons are defined within the modeling. A causal interpretation of β decay is presented. The tremendous penetrating ability of the neutrino is discussed. Structural schemas for the He and C nuclei are proposed.
Category: High Energy Particle Physics

[193] viXra:1208.0212 [pdf] submitted on 2012-08-22 12:04:26

The Mass Effect of Gravitational Potential (MEGP)

Authors: Mahgoub Salih
Comments: 4 Pages.

We show that the effect of the gravitational potential is equivalent to the dark matter that must be added to Newtonian dynamics to solve the cosmological problems. The amount of mass effect that could be generated by gravity was found to be 34% of the universe's mass. We propose Modified Newtonian Dynamics (MOND) as a consequence of the mass effect of the gravitational potential.
Category: Astrophysics

[192] viXra:1208.0211 [pdf] submitted on 2012-08-22 09:51:13

Two Experimental Consequences of the Theory of Gravitational Relativity

Authors: Joseph A. O. A.
Comments: 40 Pages. An article in volume one of The Fundamental Theory Monograph series on an evolving fundamental theory by the author.

Two experimentally verifiable implications of the expressions for the mass of a test particle and the gravitational potential on flat spacetime in a gravitational field of arbitrary strength, in the context of the theory of gravitational relativity (TGR), are derived. The first is the existing expression for the shift in perihelion per revolution of a planet round an assumed spherical Sun, derived until now on a proposed curved spacetime in a gravitational field in the context of the general theory of relativity (GR), with excellent observational support in the case of the planet Mercury, which is re-derived in a more compact and more straightforward manner on flat spacetime in the context of TGR. The second is a set of new (hybrid) forces of nature, namely gravi-electric, gravi-magnetic and laser-anti-gravitational forces, which arise when a capacitor (with an electric field of extremely large strength between its plates), an electromagnet (with a magnetic field of extremely large strength between its pole pieces) or a sphere containing very high energy radial laser beams, each safely enclosed in a box, falls freely towards a gravitational field source on flat spacetime in the context of TGR. Each of these forces is repulsive and opposes the gravitational attraction of a body towards a gravitational field source. While the applications of the gravi-electric and gravi-magnetic forces in achieving anti-gravitational thrust lack any prospect, there is good promise for the use of the laser-anti-gravitational force to control the weight (or effectively the inertia) of a body (without engine power or any other aid) in the gravitational field of the earth or a host planet.
Category: Relativity and Cosmology

[191] viXra:1208.0210 [pdf] submitted on 2012-08-22 06:36:51

Theory of an Electro-Cordic Field Operating in Quantum Systems II

Authors: R. Wayte
Comments: 32 Pages.

The physical nature of proposed electro-cordic guidewaves has been demonstrated by applying the theory to relativistic potential wells and a simple harmonic system. Interference observed in Young's slits and the Michelson interferometer has also been explained as due to active guidewave fields controlling photons. Entanglement is interpreted in terms of real coupling by interlinked guidewaves between particles or photons; so wavefunction collapse occurs when this physical link is broken. Superconductivity requires real material binding for electron-pair creation and correlation.
Category: Quantum Physics

[190] viXra:1208.0209 [pdf] submitted on 2012-08-22 04:08:56

A Possible Solution to the Hard Problem of Consciousness Using Multidimensional Approach

Authors: Alexander Egoyan
Comments: 10 Pages.

In this work a new solution to the hard problem of consciousness using a multidimensional approach [1-3] is proposed. It is shown that our perceptions may be interpreted as elastic oscillations of a two-dimensional membrane with closed topology embedded in our brains. According to the model, our universe is also a three-dimensional elastic membrane embedded in a higher-dimensional space-time. The model allows us to create a unified world picture where the physical and perceptual aspects of reality are complementary. We can observe our 2d self-membranes through our perceptions, which are encoded in elastic oscillations of the elastic membrane. According to the theory, elastic membranes occupy energetically favorable positions around the microtubules involved in Orch OR. Elastic membranes responsible for qualia interact with our brains and provide them with information about the character of incoming stimuli (pleasant or unpleasant); they squeeze to preserve quantum coherent states, producing pleasant perceptions, and stretch to avoid unpleasant ones.
Category: Mind Science

[189] viXra:1208.0208 [pdf] submitted on 2012-08-21 13:18:45

Theory of an Electro-Cordic Field Operating in Quantum Systems I.

Authors: R. Wayte
Comments: 18 Pages.

A theory of electro-cordic guidewaves is developed to supplement the standard acausal statistical laws of quantum mechanics and account for the growth of precision interference patterns from apparently random quantum events. Every effort is made to reveal the physical reality of the guidewaves which organise photons or electrons into predictable states. Einstein's equations of general relativity have also been applied to hydrogen, to yield energy levels identical to those of Dirac's theory. A companion paper will cover other applications of electro-cordic guidewaves in quantum theory to interference, tunnelling, non-local phenomena, and superconductivity.
Category: Quantum Physics

[188] viXra:1208.0207 [pdf] replaced on 2012-08-25 15:00:57

Phase Change Dye Experiment for Wisps

Authors: B.D.O. Adams
Comments: 8 Pages. Spelling corrections in new version

We describe a simple experiment to look for currents of very low mass particles interacting via some fifth force. We assume some baryons also interact with this force, providing a connection to normal matter. We also assume that both fermion and boson low mass particles are present. In such a case fermions would be more attracted to low density matter while bosons would be preferred at high density. If such particles are present we might detect a current between condensing gases and boiling liquids. An experiment is designed whereby such a current might turn asymmetric dye molecules to point in the direction of the current. Such turned dye molecules may then be detected by their effect on polarised light. We design such an experiment and perform it at a desktop scale. We hope future experimenters may perform the experiment to better precision, with other fluids and more professional tolerances than we can provide.
Category: High Energy Particle Physics

[187] viXra:1208.0206 [pdf] submitted on 2012-08-21 14:07:33

The Transformation of the Matter Wave's Frequency and Wavelength in the 2-Dimension Inertial Coordinate System

Authors: sangwha Yi
Comments: 4 Pages.

In the special theory of relativity, in the 2-dimensional inertial coordinate system, we study the transformation of the matter wave's frequency and wavelength.
Category: Relativity and Cosmology

[186] viXra:1208.0205 [pdf] submitted on 2012-08-21 14:15:44

The Acceleration of the 2-Dimension Inertial System and the Matter Wave

Authors: sangwha Yi
Comments: 9 Pages.

In the special theory of relativity, the acceleration of accelerated matter that has an initial velocity in the 2-dimensional inertial coordinate system and the acceleration of accelerated matter that has no initial velocity in the 2-dimensional inertial coordinate system are the same. Therefore, using this, we derive the equation of motion and the transformation for the matter wave.
Category: Relativity and Cosmology

[185] viXra:1208.0204 [pdf] submitted on 2012-08-20 21:28:38

Fuzzy Grids-Based Intrusion Detection in Neural Networks

Authors: Izani Islam, Tahir Ahmad, Ali H. Murid
Comments: 18 Pages.

The proposed system is developed in two main phases plus a supplementary optimizing stage. In the first phase, the most important features are selected using fuzzy association rules mining (FARM) to reduce the dimension of the input features to the misuse detector. In the second phase, a fuzzy adaptive resonance theory-based neural network (ARTMAP) is used as the misuse detector. The accuracy of the proposed approach depends strongly on the precision of the parameters of the FARM module and of the fuzzy ARTMAP neural classifier, so a genetic algorithm (GA) is incorporated into the proposed method to optimize the parameters of these modules. Classification rate (CR) results show the important role of the GA in improving the performance of the proposed intrusion detection system (IDS). The performance of the proposed system is investigated in terms of detection rate (DR), false alarm rate (FAR) and cost per example (CPE).
Category: Data Structures and Algorithms
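
As a hedged illustration of the optimization stage described above (a generic genetic-algorithm loop in Python, not the authors' system; the fitness function below is a hypothetical stand-in for the classification rate obtained by training and evaluating the detector with a given parameter pair):

```python
import random

random.seed(1)

def toy_fitness(params):
    """Hypothetical stand-in for the classification rate of a detector
    trained with (vigilance, min_support); peaks near (0.85, 0.10)."""
    vigilance, min_support = params
    return -((vigilance - 0.85) ** 2 + (min_support - 0.10) ** 2)

def genetic_search(fitness, pop_size=20, generations=40, mutation_sigma=0.05):
    """Keep the best half of the population, breed children by averaging
    two parents and adding Gaussian mutation noise."""
    population = [(random.random(), random.random()) for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            children.append(tuple((x + y) / 2 + random.gauss(0, mutation_sigma)
                                  for x, y in zip(a, b)))
        population = parents + children
    return max(population, key=fitness)

print(genetic_search(toy_fitness))   # converges near (0.85, 0.10)
```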

[184] viXra:1208.0203 [pdf] replaced on 2013-08-27 17:54:35

Linear and Angular Momentum Spaces for Majorana Spinors

Authors: Leonardo Pedro
Comments: 26 Pages. corrected the Proofs of Propositions 2.10, 3.5 and 6.13

In a Majorana basis, the Dirac equation for a free spin one-half particle is a 4x4 real matrix differential equation. The solution can be a Majorana spinor, a 4x1 real column matrix, whose entries are real functions of the space-time. Can a Majorana spinor, whose entries are real functions of the space-time, describe the energy, linear and angular momentums of a free spin one-half particle? We show that it can. We show that the Majorana spinor is an irreducible representation of the double cover of the proper orthochronous Lorentz group and of the full Lorentz group. The Fourier-Majorana and Hankel-Majorana transforms are defined and related to the linear and angular momentums of a free spin one-half particle.
Category: High Energy Particle Physics

[183] viXra:1208.0202 [pdf] submitted on 2012-08-20 12:33:59

The Maxwell Equations, the Lorentz Field and the Electromagnetic Nanofield with Regard to the Question of Relativity

Authors: Daniele Sasso
Comments: 10 Pages.

We discuss the electromagnetic theory in some main respects, specifically in relation to the question of relativity. We consider the Maxwell equations for the representation of the electromagnetic field and the Lorentz force for the description of the motion of a particle in a field of magnetic induction. The Lorentz force is also useful for describing the behavior of the electromagnetic field in the presence of cut flux, that is, the physical situation which occurs with respect to interacting moving reference systems. Finally, we examine the physics of the electromagnetic nanofield, which is essential for defining the behaviour of the single energy quanta that compose the energy radiation. We have to accept that physics and science have an insurmountable limit and that they can give an answer to all possible questions except the first and the last. Those who want to give an answer to these two questions have to leave physics and science, with their method, and enter the domain of philosophy or of religion.
Category: Relativity and Cosmology

[182] viXra:1208.0201 [pdf] submitted on 2012-08-20 11:31:30

The Universe an Effect Without Cause

Authors: Philip Gibbs
Comments: 11 Pages. FQXi essay 2012

Through the history of science we have become accustomed to experiencing paradigm shifts in our fundamental understanding of the Universe. Previously-cherished principles have been abandoned by radical thinkers in order to free them of the constraints that were hindering progress. Copernicus ousted the geocentric worldview that had been the dogma for centuries and Einstein led us to abandon the absolutes of time and space introduced by Newton, then Heisenberg took away certainty leaving us to accept unavoidable unpredictability in the laws of nature. In each case the revolutionary move was met with strong resistance from the ruling guard of physicists, but eventually victory fell into the hands of a new generation of thinkers. Each of these revolutionary changes came as a surprise, but the next great shift in thinking will be different in that it has long been anticipated. Physicists already expect that some former assumptions will be tomorrow’s sacrifices in the battle to understand the nature of reality. They know that everyday senses, intuition and philosophical prejudice cannot be trusted when exploring the fundamental laws that prevail in physical regimes that are not part of our ordinary experience. They have seen it all before and all agree that something important has to give before the next breakthrough can be struck. I think it is clear that space and time will be the first casualties of this revolution. They will become emergent properties of a deeper reality. That is the easier part but with them, locality and causality must also fail. Of these it is temporal causality – the principle that every effect has a preceding cause – that is the hardest for scientists to lose. In this essay I discuss why this must happen and what can take its place.
Category: History and Philosophy of Physics

[181] viXra:1208.0200 [pdf] submitted on 2012-08-20 03:38:01

A Mild Generalization of Zariski's Lemma

Authors: Pierre-Yves Gaillard
Comments: 2 Pages.

We give a mild generalization of Zariski's Lemma.
Category: Algebra

[180] viXra:1208.0198 [pdf] submitted on 2012-08-19 10:43:49

Experimental Work on Horizontal Axis PVC Turbine Blade of Power Wind Mill

Authors: H. S. Patil
Comments: 11 Pages.

Growing concern over environmental degradation has led to worldwide interest in renewable energy sources. Wind energy is rapidly emerging as one of the most cost-effective forms of renewable energy, with very significant increases in annual installed capacity being reported around the world. The favoured form of turbine used for electricity generation purposes is the Horizontal Axis Wind Turbine (HAWT) with low solidity ratio (ratio of blade area to swept area) and high tip speed ratio, λ = ΩR/Vwind, where R is the radius of the blades and Vwind is the wind velocity. This type of turbine has a high efficiency or coefficient of performance (Cp), but relatively low torque. Wind energy is kinetic energy associated with the movement of atmospheric air. Wind energy systems for irrigation and milling have been in use since ancient times, and since the beginning of the 20th century they have been used to generate electric power. Windmills for water pumping have been installed in many countries, particularly in rural areas. Wind turbines transform the energy in the wind into mechanical power, which can then be used directly for grinding etc. or converted further into electric power. Wind turbines can be used singly or in clusters called 'wind farms'. Small wind turbines called aero-generators can be used to charge large batteries. Five nations (Germany, USA, Denmark, Spain and India) account for 80% of the world's installed wind energy capacity. Wind energy continues to be the fastest growing renewable energy source in terms of worldwide installed capacity. India ranks 5th in the world, with large wind power capacities established in commercial projects. In India the states of Tamilnadu and Gujarat lead in the field of wind energy. There are about a dozen wind pumps of various designs, scattered over the country, providing water for agricultural and domestic purposes. Today India is a major player in the global wind energy market. The present work was originally devised as a student project to examine the possibility of developing a small scale, high torque, self-starting HAWT for applications such as water pumping. In the following we outline the development of the concept of the PVC-type HAWT and of an experimental setup of the device, which includes its design, manufacture, commissioning and preliminary testing.
Category: General Science and Philosophy
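
As a small worked example of the quantities defined in the abstract (the numbers below are illustrative assumptions, not measurements from the paper's prototype), the available rotor power and the tip speed ratio follow directly from P = 0.5*ρ*A*V^3*Cp and λ = ΩR/Vwind:

```python
import math

def wind_power(rho, radius, v_wind, cp):
    """Mechanical power extracted by a HAWT rotor: P = 0.5 * rho * A * V^3 * Cp."""
    swept_area = math.pi * radius ** 2
    return 0.5 * rho * swept_area * v_wind ** 3 * cp

def tip_speed_ratio(omega, radius, v_wind):
    """lambda = Omega * R / V_wind, with Omega in rad/s."""
    return omega * radius / v_wind

# Hypothetical small rotor: R = 1.5 m, 4 m/s wind, Cp = 0.30, rotor at 120 rpm.
rho_air = 1.225                        # kg/m^3, air density at sea level
omega = 120 * 2 * math.pi / 60         # 120 rpm in rad/s
print(wind_power(rho_air, 1.5, 4.0, 0.30))    # ~83 W of shaft power
print(tip_speed_ratio(omega, 1.5, 4.0))       # tip speed ratio ~4.7
```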

[179] viXra:1208.0197 [pdf] submitted on 2012-08-19 10:44:37

Analysis of the Effect of Changes in Fuel Injector Position on Gas Turbine Combustion Efficiency Using Large Eddy Simulation (LES)

Authors: Shandy Kharisma Irianto, Gunawan Nugroho
Comments: 20 Pages.

Research on gas turbines has been widely performed, especially on combustion chambers with a non-premixed combustion process. In this work, the combustion chamber is studied using the LES method. The aim of this study is to analyze the combustion process with various fuel injector angles and fuel rates. The analyzed aspects are the flow pattern, temperature distribution and species concentrations at the stoichiometric condition. The mixing process between fuel and air in the combustion zone is affected by the increase in injector angle, with 33.55° (model 1) being the optimum injector angle to obtain the highest efficiency of 84.6%. It is noted that model 2 (45° injector) has the lowest emission of the waste gases CH4 and CO, i.e. 0.75 and 0.089 times lower than those of model 1, respectively. However, the combustion efficiency of model 2 is lower than that of model 1, at 82.7%.
Category: General Science and Philosophy

[178] viXra:1208.0196 [pdf] submitted on 2012-08-19 10:45:29

Finite Element Analysis of Stresses Caused by External Holes in Hydraulic Cylinders

Authors: A.N. El Kholy, M. A. Kamel, M. O. Mousa
Comments: 11 Pages.

This paper evaluates simulations of holes in the wall of a hydraulic cylinder. The stresses are generated incrementally in the finite element method under internal pressure. The holes, which can be considered as stress raisers, are located in the external surface. The effect of the hole depth, which is varied between 0.5 and 4.5 mm, and the hole diameter, which is varied between 1 and 2.5 mm, on the generated stresses is presented. It was found that the hoop stress increases with increasing hole parameters, diameter and depth. Moreover, the notch characterization is used to determine the maximum stress limit.
Category: General Science and Philosophy

[177] viXra:1208.0195 [pdf] submitted on 2012-08-19 10:46:37

Static Response of Transversely Isotropic Elastic Medium with Irregularity Present in the Medium

Authors: Dinesh Kumar Madan, Anita Dahiya, Shamta Chugh
Comments: 11 Pages.

In the present paper, closed-form expressions for the displacements at any point of a transversely isotropic elastic medium with an irregularity present in the medium have been obtained. A model is considered in which the irregularity is expressed by a rectangular shape and the medium is taken to be free from initial stress. To study the effect of the irregularity present in the medium, the variation of the displacements with horizontal distance has been drawn for different values of the irregularity size. The comparison between the displacements for isotropic and transversely isotropic elastic media is also shown graphically. It is found that the irregularity has a notable effect on this deformation.
Category: General Science and Philosophy

[176] viXra:1208.0194 [pdf] submitted on 2012-08-19 10:48:22

TQM Model for the Competitive Advantages of an Electromechanical Company

Authors: Fabio De Felice, Antonella Petrillo
Comments: 18 Pages.

The present work shows the results of a study carried out in an electromechanical company aiming at the qualification of a particular, completely automated process. The work has been carried out by employing statistical techniques and problem-solving instruments. In particular, the determination of the causes of the main problems on this line has been realized by means of instruments such as the Ishikawa diagram, scatter plots and stratification. An attempt was made to intervene on the main causes of the problems and to reduce, each time, the dispersion of the output values around the tendency value.
Category: General Science and Philosophy

[175] viXra:1208.0193 [pdf] submitted on 2012-08-19 10:49:22

Titanium – Aluminium Intermetallic Thin Films Preparation by DC Sputtering and Their Characterization

Authors: Sudheendra P., A. O. Surendranathan, N. K. Udayashamkar, K. S. Choudhari
Comments: 6 Pages.

DC magnetron sputtering is a well-developed deposition technique for coatings and thin films used in industrial applications. The experiments were performed with unbalanced circular magnetron sputtering targets of aluminium (99.999%) and titanium (99.99%). Sputtering of aluminium (Al) and titanium (Ti) was carried out in a pure argon (99.999%) atmosphere at a base pressure of 4 × 10⁻⁶ torr and a constant sputtering pressure of 5 × 10⁻³ torr. The substrate materials were mainly stainless steel (304) and aluminium plates. Characterization of the TiAl films deposited onto the different substrates was carried out using XRD, SEM and EDS analysis techniques. The film surface and cross-section were examined using a scanning electron microscope (SEM). The TiAl phase was confirmed using XRD analysis. The composition of the TiAl film was determined using the EDS technique. These characterizations revealed the growth of a TiAl intermetallic thin film with a characteristic crystallite size of 123.9 Å and a lattice strain of 0.1352%. A columnar growth perpendicular to the substrate surface was also observed repeatedly in our experiment. The microhardness of the TiAl film had an average value of 1873 HV.
Category: General Science and Philosophy

[174] viXra:1208.0192 [pdf] submitted on 2012-08-19 10:50:16

Performance Effects of Increase in Hydrogen Percentage by Volume on CNG Sequential Injection 3 Cylinder S.i Engine

Authors: P. T. Nitnaware, J. G. Suryawanshi
Comments: 9 Pages.

Nowadays global warming has become an important issue. The nation is also facing a fuel crisis due to growth in the automobile sector. Blends of hydrogen and CNG have the potential to satisfy Euro V norms with a margin. Experimentation on a 3-cylinder, water-cooled SI engine with an eddy current dynamometer, modified to a CNG sequential gas injection system with varying percentage of hydrogen by volume, showed a reduction in emissions and an increase in power output.
Category: General Science and Philosophy

[173] viXra:1208.0191 [pdf] submitted on 2012-08-19 10:51:44

Effects of Inertia and Process Parameters on Isothermal High Speed Two-Layer Filament Jet Flow

Authors: Md. Abdul Wakil, Z. U. Ahmed
Comments: 9 Pages.

This paper investigates the influences of inertia and process parameters on the two-layer fiber spinning process for incompressible, isothermal and Newtonian filament jet flow. The present study focuses on the steady flow considering inertia, gravity and a non-uniform velocity of each layer across the fiber. The governing equations are solved numerically as a nonlinear two-point boundary value problem, since an analytical solution is practically impossible. The effects of inertia and of the initial process conditions (draw ratio, initial velocity ratio and die exit radius ratio) are discussed. The velocity increases monotonically with axial position in each layer due to the inertia effect, at a rate that is relatively slower (faster) near the spinneret (take-up point) as Re increases. In contrast, the radii decrease monotonically with axial position in each layer.
Category: General Science and Philosophy

[172] viXra:1208.0190 [pdf] submitted on 2012-08-19 10:53:00

Study on Fuel Properties of Various Vegetable Oil Available in Bangladesh and Biodiesel Production

Authors: Md. Abdul Wakil, Z.U. Ahmed, Md. Hasibur Rahman, Md. Arifuzzaman
Comments: 8 Pages.

The present review aims to study the prospects and opportunities of introducing vegetable oils and their derivatives as fuel in diesel engines. Vegetable oils always possess some fuel properties. In this investigation cottonseed oil, Mosna oil and sesame oil are chosen for producing biodiesel as an alternative fuel for diesel engines. The fuel-related properties of these oils are reviewed and compared with those of conventional diesel fuel. Biodiesel is produced by transesterifying the oil with an alcohol such as methanol under mild conditions in the presence of a base catalyst. A satisfactory amount of biodiesel is produced from cottonseed oil at a 3:1 molar ratio of methanol to oil. Biodiesel from cottonseed oil has various fuel properties which are similar to those of diesel. The cost of biodiesel production is also analyzed. This paper discusses a general perspective of biodiesel.
Category: General Science and Philosophy

[171] viXra:1208.0189 [pdf] submitted on 2012-08-19 10:55:39

Some Innovations in Design of Low Cost Variable Compression Ratio Two Stroke Petrol Engine

Authors: A.Srinivas, G.Venkatasubbaiah, P.Venkateswar rao, M. Penchal Reddy
Comments: 9 Pages.

Historically, two stroke petrol engines have found wide application in the construction of two wheelers worldwide; however, due to stringent environmental laws enforced universally, these engines are fading in numbers. In spite of the tight norms, internationally these engines are still used in agriculture, gensets, etc. Several designs of variable compression ratio two stroke engines are commercially available for analysis purposes. In the present investigation a novel method of changing the compression ratio is proposed, applied, studied and analyzed. The clearance volume of the engine is altered by introducing a metal plug into the combustion chamber. This modification permits four different values of the clearance volume. Keeping in view the studies required, the work is brought out in two sections. This paper deals with the design, analysis, testing at different compression ratios, modification and engine fabrication. It is observed that an increase in compression ratio improves fuel efficiency and power output. The novelty of this work is to permit the two wheeler driver to change the compression ratio.
Category: General Science and Philosophy

[170] viXra:1208.0188 [pdf] submitted on 2012-08-19 10:12:51

The Wave Equation and Spherical Time

Authors: Gary D. Simpson
Comments: 24 Pages.

The mathematical treatment of time is revised to be spherical rather than linear. The resulting wave equation is solved for spherical time and space. The solution has a finite value at r = 0 and t = 0. Also, it is simple to produce an equation that is inversely proportional to r^2, as is necessary for gravity and electrostatics.
Category: Quantum Physics

[169] viXra:1208.0187 [pdf] submitted on 2012-08-19 10:36:46

Analysis of a Wind Turbine Blade Profile for Tapping Wind Power at the Regions of Low Wind Speed

Authors: S. P. Vendan, S. Aravind Lovelin, M. Manibharathi, C. Rajkumar
Comments: 10 Pages.

The project is aimed at designing a wind turbine for tapping the low speed wind in urban locations. It is to be noted that most of the high wind power density regions in the zone of high wind speed are already being tapped, and this offers large scope for the development of low wind speed turbines. Our study focuses primarily on designing the blade for tapping power in regions of low wind power density. The aerodynamic profiles of wind turbine blades have a crucial influence on the aerodynamic efficiency of the wind turbine. This involves the selection of a suitable airfoil section for the proposed wind turbine blade. The NACA 63 series is chosen as the basic group for investigation because it has good low speed characteristics and its power curve is better in the low and medium wind speed ranges. In this paper the NACA 63-415 airfoil profile is considered for the analysis of the wind turbine blade. The NACA 63-415 airfoil profile is created by using the coordinate file generated in JavaFoil. A C-mesh domain for the fluid around the airfoil is created using Design Modeler in ANSYS Workbench. The CFD analysis is carried out using STAR-CCM+ at various angles of attack from 0° to 16°. The lift and drag coefficients are calculated for low Reynolds numbers and the pressure distributions are also plotted. The airfoil NACA 63-415 is analyzed using computational fluid dynamics to identify its suitability for application on wind turbine blades, and good agreement is found between the results.
Category: General Science and Philosophy

[168] viXra:1208.0186 [pdf] submitted on 2012-08-19 10:37:55

Numerical Investigation of Supersonic Nozzle Producing Maximum Thrust For Altitude Variation

Authors: Muhammad Misbah-Ul Islam, Mohammad Mashud, Md. Hasan Ali, Abdullah Al Bari
Comments: 18 Pages.

This numerical investigation is focused on generating data for developing an optimum profile of a supersonic nozzle irrespective of the altitude of operation. The investigation has been carried out for different altitudes while the combustion conditions, including combustion temperature, combustion pressure, specific heat ratio and molecular weight, remain unchanged. Considering the aerodynamic issues, the method of characteristics is chosen for profile generation. During the application of the method of characteristics, the ratio of exit pressure to atmospheric pressure is maintained at unity. The coding has been done in the MATLAB interface with the aim of generating maximum thrust at the outlet of the optimized supersonic nozzle. Both the Mach number and pressure distributions for maximum thrust generation are within the domain of interest of this investigation.
Category: General Science and Philosophy
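
As a hedged sketch of two standard isentropic-flow quantities that feed a method-of-characteristics nozzle design (textbook relations only, not the paper's MATLAB code; the 20:1 pressure ratio and γ = 1.4 are illustrative assumptions): the exit Mach number follows from the chamber-to-exit pressure ratio, and the Prandtl-Meyer angle at that Mach number sets the maximum wall expansion angle of a minimum-length nozzle.

```python
import math

def exit_mach(p0_over_pe, gamma=1.4):
    """Isentropic exit Mach number from the chamber-to-exit pressure ratio:
    p0/pe = (1 + (gamma-1)/2 * M^2)^(gamma/(gamma-1))."""
    return math.sqrt(2.0 / (gamma - 1.0) *
                     (p0_over_pe ** ((gamma - 1.0) / gamma) - 1.0))

def prandtl_meyer_deg(mach, gamma=1.4):
    """Prandtl-Meyer angle nu(M) in degrees, the quantity marched along
    characteristics in a method-of-characteristics nozzle design."""
    a = math.sqrt((gamma + 1.0) / (gamma - 1.0))
    b = math.sqrt((gamma - 1.0) / (gamma + 1.0) * (mach ** 2 - 1.0))
    return math.degrees(a * math.atan(b) - math.atan(math.sqrt(mach ** 2 - 1.0)))

# Hypothetical operating point: chamber pressure 20x ambient, exit pressure
# matched to ambient (pe/pa = 1) as in the abstract.
Me = exit_mach(20.0)
print(Me)                             # ~2.60
print(prandtl_meyer_deg(Me) / 2.0)    # max wall angle of a minimum-length nozzle, deg
```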

[167] viXra:1208.0185 [pdf] submitted on 2012-08-19 10:38:50

Numerical Simulation to Describe the Soybean Glycine Max (L.) Drying Process: Influence of Air Velocity, Temperature and Initial Moisture Content

Authors: Camila Nicola Boeri, Oleg Khatchatourian
Comments: 12 Pages.

The production of soybeans requires that the product is collected healthy and in good time, to minimize losses caused in the field by the attack of insects, diseases and microorganisms. Therefore, and due to the high moisture content at harvest, drying is one of the operations of primary importance among the techniques involved in the conservation of the desirable qualities of products of plant origin. The objective of this work is to obtain the drying curves of soybeans, in the range of drying air temperatures between 45 and 90°C, for initial moisture contents between 0.13 and 0.32 and drying air velocities of 0, 0.5, 0.9, 1.5 and 2.5 m/s, to determine the influence of these parameters on the process. The experimental phase was performed using a prototype consisting of a metal tube 0.15 m in diameter, insulated over its whole surface with glass wool and canvas. The air was heated by six electrical resistances with a power of 600 W, while the temperature was controlled with the aid of thermocouples connected to the drying equipment. Numerical simulations were also performed, where the mathematical model used was the one proposed by Khatchatourian [1]; in this research the equation that describes the mass flow was changed by entering the parameters of air velocity and initial moisture content, obtaining a good agreement between experimental and simulated data. It was observed that the drying air velocity has a significant influence on the process, with an increased removal of water during the first hours of drying. Note that the influence of air temperature on the drying rate is higher at the beginning of the experiment, reducing the processing time. The higher the temperature and airflow, the greater the drying rate and the lower the total time of exposure to heated air.
Category: General Science and Philosophy

[166] viXra:1208.0184 [pdf] submitted on 2012-08-19 10:39:43

Large Eddy Simulation (LES) of the Effect of Swirl Number on the Efficiency of Gas Turbine Combustion

Authors: Farid Viko Bhaskarra, Gunawan Nugroho
Comments: 9 Pages.

A good mixing process is required in designing a gas turbine combustor. Numerical simulations using Large Eddy Simulation are well suited to address these issues. In this study, a numerical simulation of non-reacting flow in a gas turbine combustor was performed. There were five variations of the swirler angle (5°, 15°, 20°, 25° and 30°). The performance of these new swirlers was investigated. The main target of this investigation is to determine the effect of the swirler angle on the combustion recirculation zone. The results show that the longest flame stagnation point of 45.26022 mm was obtained at a swirler angle of 25°.
Category: General Science and Philosophy

[165] viXra:1208.0183 [pdf] submitted on 2012-08-19 10:40:54

Life Prediction of Aileron Actuator Using Finite Element Analysis

Authors: Byeong-Sam Kimi, Kyoungwoo Park, Hyeon-Hee Kim
Comments: 8 Pages.

This paper describes the application of a modern fatigue prediction tool, based on FE-analysis results, to a problem highly specific to the aerospace industry: fatigue life prediction for an aileron actuator system. The actuator system is mounted inside the wing to meet the needs of the aileron design and its kinematic motion, and structural analysis results are presented to ensure structural safety. FE analysis can provide the estimation of the crack growth curves with sufficient accuracy, even in the case of complicated aileron actuator structures which are crucial for preserving aileron integrity and which participate in the transfer of load. The probability of crack detection, or any other damage detection, is a result of many factors.
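As a hedged illustration of how crack growth curves are typically turned into life estimates, the sketch below integrates the standard Paris law; the material constants, geometry factor and stress range are assumed values, and the Paris law itself is a generic model, not necessarily the specific tool used in the paper.

```python
# Standard Paris-law crack-growth integration, shown only to illustrate how an
# FE-based fatigue tool turns stress ranges into crack-growth curves. The
# material constants C, m and the geometry factor Y are assumed values, not
# data from the aileron actuator study.
import math

def cycles_to_grow(a0_m, af_m, dsigma_mpa, C=6.9e-12, m=3.0, Y=1.12, steps=20000):
    """Numerically integrate da/dN = C * (Y * dsigma * sqrt(pi*a))**m."""
    a = a0_m
    da = (af_m - a0_m) / steps
    cycles = 0.0
    for _ in range(steps):
        dK = Y * dsigma_mpa * math.sqrt(math.pi * a)  # stress intensity range, MPa*sqrt(m)
        cycles += da / (C * dK**m)
        a += da
    return cycles

if __name__ == "__main__":
    N = cycles_to_grow(a0_m=0.5e-3, af_m=10e-3, dsigma_mpa=120.0)
    print(f"Estimated life: {N:,.0f} cycles")
```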
Category: General Science and Philosophy

[164] viXra:1208.0182 [pdf] submitted on 2012-08-19 10:42:05

Free Vibration Analysis of Rectangular Plates Using Galerkin-Based Finite Element Method

Authors: Neffati M. Werfalli, Abobaker A. Karoud
Comments: 9 Pages.

In the present work, a study of the free vibration of thin isotropic rectangular plates with various edge conditions is conducted. This study involves obtaining the natural frequencies by solving the mathematical model that governs the vibration behavior of the plate using a Galerkin-based finite element method. Cubic quadrilateral serendipity subparametric elements with twelve degrees of freedom are used in this analysis. Even though the order of polynomial used is the lowest possible, the effectiveness of the method for calculating the natural frequencies accurately is demonstrated by comparing the solution obtained against existing analytical results. The effects of the aspect ratio, the number of elements, and the number of sampling points on the accuracy of the solution are also presented.
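A minimal sketch of the classical analytical benchmark such finite element results are normally compared against: the natural frequencies of a simply supported thin rectangular plate. The plate dimensions and material data below are assumed, not those of the paper.

```python
# Classical natural frequencies of a simply supported thin rectangular plate,
# omega_mn = pi^2 * ((m/a)^2 + (n/b)^2) * sqrt(D / (rho*h)), with flexural
# rigidity D = E*h^3 / (12*(1 - nu^2)). Geometry and material are assumed.
import math

def natural_frequency_hz(m, n, a, b, h, E, rho, nu=0.3):
    """Return the (m, n) natural frequency in Hz."""
    D = E * h**3 / (12.0 * (1.0 - nu**2))
    omega = math.pi**2 * ((m / a)**2 + (n / b)**2) * math.sqrt(D / (rho * h))
    return omega / (2.0 * math.pi)

if __name__ == "__main__":
    # assumed steel plate: 0.4 m x 0.3 m x 2 mm
    for (m, n) in [(1, 1), (1, 2), (2, 1), (2, 2)]:
        f = natural_frequency_hz(m, n, a=0.4, b=0.3, h=0.002, E=210e9, rho=7850.0)
        print(f"mode ({m},{n}): {f:8.1f} Hz")
```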
Category: General Science and Philosophy

[163] viXra:1208.0181 [pdf] submitted on 2012-08-19 10:43:02

Prospect of Bio-Diesel Production from Soybean Oil and Sesame Oil: an Alternative and Renewable Fuel for Diesel Engines

Authors: Md. Abdullah Al Bari, Hasan Ali, Mizanur Rahman, Rakibul Hossain
Comments: 7 Pages.

Energy is a prerequisite for modern civilization. Fossil fuel is still the main source of energy, but its relentless consumption has brought its reserves close to an end. As a result, fuel prices are soaring as a consequence of spiraling demand and diminishing supply, so alternative and cost-effective fuels are constantly being sought. Diesel engines are more efficient and cost-effective than other engines and therefore have versatile uses (e.g. automobiles, irrigation, power plants), which is why consumption of diesel fuel is much higher than that of other fuels. This paper estimates the feasibility of soybean oil and sesame oil as alternative fuels for diesel engines. The production of biodiesel from soybean oil and sesame oil, its properties, and a comparison of the test results with those of other biodiesels and of diesel are presented. Biodiesels were produced experimentally from soybean and sesame oils with yields of 89.75% and 82.64% respectively. The calorific values of the biodiesels from soybean and sesame oil are 41.57 MJ/kg and 43.67 MJ/kg, while that of diesel is 44.5 MJ/kg. The kinematic viscosities of the biodiesels extracted from soybean and sesame oils are 2.068×10⁻⁶ m²/s and 2.292×10⁻⁶ m²/s respectively, while that of diesel is 2.068×10⁻⁶ m²/s. The flash points of the biodiesels from soybean and sesame oil are 96°C and 94°C, while that of diesel is 75°C. The production costs of the biodiesels from soybean and sesame oil are Tk. 296.8 and Tk. 370 per liter respectively. These oils or any of their blends could be used as an alternative in case of crisis.
Category: General Science and Philosophy

[162] viXra:1208.0180 [pdf] submitted on 2012-08-19 08:55:33

Realization of CANopen Communication Control for an Underactuated Anthropomorphic Finger

Authors: Mohamad Khairi Ishak, Jamaludin Jalani
Comments: 11 Pages.

This paper proposes an alternative control communication system based on the CANopen application, which will be used for controlling underactuated anthropomorphic fingers. It is anticipated that the CANopen network can be developed easily and reliably integrated with the Bristol Elumotion Robot Hand (BERUL). The real-time network has to be incorporated into dSPACE, a well-known Matlab Simulink-based controller prototyping system. Experimental results show that CANopen is reliable enough to be implemented for underactuated anthropomorphic fingers.
Category: General Science and Philosophy

[161] viXra:1208.0179 [pdf] submitted on 2012-08-19 08:56:58

Analysis of Piezoelectric Actuator for Vibration Control of Thin Cylindrical Shells

Authors: V. K. Srivastava
Comments: 10 Pages.

An analytical method is proposed for selecting the best-suited actuator which can preferably be used in a variety of structural control applications. The selection is based on matching the performance characteristics of the actuator, such as force and displacement, to the requirements of the given task. Relations between the mid-surface strains and the strain induced in the piezoelectric actuator due to the application of an electric field are derived to optimize the thickness of the piezoelectric layer.
Category: General Science and Philosophy

[160] viXra:1208.0176 [pdf] submitted on 2012-08-19 09:00:15

Through-Thickness Shearing Effects on Geometric Non-linear Behavior of Thin and Thick Functionally Graded Plates under Pressure Loads

Authors: M. Hajikazemi, M. H. Sadr, M. Ramezani-Oliaee
Comments: 11 Pages.

The effects of through-the-thickness shearing strain energy on the geometric non-linear behavior of thin and relatively thick rectangular functionally graded plates are studied in this paper. It is assumed that the mechanical properties of the plates, graded through the thickness, are described by a simple power-law distribution in terms of the volume fractions of the constituents. The plates are assumed to be under lateral pressure loads. The fundamental equations for rectangular FGM plates are obtained using the classical laminated plate theory (CLPT), first-order shear deformation theory (FSDT) and higher-order shear deformation theory (HSDT) for large deflection, and the solution is obtained by minimization of the total potential energy.
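The power-law gradation mentioned above can be illustrated with a short sketch; the ceramic and metal moduli and the exponent n are assumed values, not the specific material data of the paper.

```python
# Power-law through-thickness gradation commonly used for FGM plates:
# Vc(z) = (z/h + 1/2)^n and P(z) = (Pc - Pm) * Vc(z) + Pm.
# The ceramic/metal moduli and the exponent n below are assumed illustrative
# values, not the material data of the paper.

def volume_fraction(z, h, n):
    """Ceramic volume fraction at height z in [-h/2, +h/2] for exponent n."""
    return (z / h + 0.5) ** n

def youngs_modulus(z, h, n, E_ceramic=380e9, E_metal=70e9):
    """Effective modulus by the rule of mixtures with the power-law fraction."""
    vc = volume_fraction(z, h, n)
    return (E_ceramic - E_metal) * vc + E_metal

if __name__ == "__main__":
    h, n = 0.02, 2.0
    for k in range(5):
        z = -h / 2 + k * h / 4
        print(f"z = {z:+.4f} m  E = {youngs_modulus(z, h, n) / 1e9:6.1f} GPa")
```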
Category: General Science and Philosophy

[159] viXra:1208.0175 [pdf] submitted on 2012-08-19 09:06:16

Modification of Determination Procedure of Numerical 3D Correction Factor (β) of Arcan Specimen

Authors: Masood Nikbakht, NaghdAli Choupani, Hossein Hosseini Todeshki
Comments: 8 Pages.

In this paper the mixed-mode interlaminar fracture behavior of carbon-epoxy composite specimens is investigated on the basis of numerical analyses. Studying the behavior of composite materials and determining their ultimate strength is an essential issue in practical engineering. Hence, the behavior of the carbon-epoxy laminated composite is studied numerically by modeling the Arcan specimen in the ABAQUS finite element software. The modeling was carried out such that loading can be applied at different loading angles, and the analyses were repeated for a wide range of crack length ratios between 0.1 and 0.9. The numerical analysis was performed with ABAQUS under a constant load of 1000 N; the entire test apparatus was modeled in both two and three dimensions. The results of the numerical analyses are presented in several diagrams. A hypothesis about the boundary conditions of the two-dimensional models is also investigated and verified. The results show that some of the conventional constraints must be modified to extract the correct correction factors from the finite element models.
Category: General Science and Philosophy

[158] viXra:1208.0174 [pdf] submitted on 2012-08-19 09:06:58

Modelling and Simulation of Plate Heat Exchanger

Authors: Naseem Ahmad Khan, Wasi ur Rahman
Comments: 9 Pages.

This paper presents a simulation investigation of a plate heat exchanger. Basically, it includes the development of a mathematical model to describe its operation and analysis. The model, after testing against existing experimental data, has been solved to obtain the effect of various parameters like mass flow rate, number of flow channels, plate configuration and flow patterns. The model of a plate heat exchanger has been described by a set of continuity, momentum and energy equations with a number of simplifying assumptions. A heat transfer rate equation has also been included in the energy balance equation to take care of the phenomena occurring therein. The mathematical model has been solved by the use of a finite difference technique with intervals of Δt = 0.005 s and Δz = 0.005 m to obtain the transient and steady-state behavior.
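A minimal sketch of an explicit finite-difference march of the kind described, using the quoted steps Δt = 0.005 s and Δz = 0.005 m on a single channel with upwind advection and a lumped heat-exchange term; the velocity, exchange coefficient and wall temperature are assumed, and this is a simplification of, not a reproduction of, the paper's model.

```python
# Minimal explicit finite-difference sketch of a single heat-exchanger channel:
# upwind advection plus a lumped heat-exchange source term, marched with the
# step sizes quoted in the abstract (dt = 0.005 s, dz = 0.005 m). The velocity,
# exchange coefficient and wall temperature are assumed values.

def simulate(length=0.5, dz=0.005, dt=0.005, t_end=20.0,
             u=0.1, k_exchange=0.8, T_in=20.0, T_wall=80.0):
    n = int(length / dz) + 1
    T = [T_in] * n                      # initial channel temperature, deg C
    steps = int(t_end / dt)
    for _ in range(steps):
        Tn = T[:]
        Tn[0] = T_in                    # inlet boundary condition
        for i in range(1, n):
            advection = -u * (T[i] - T[i - 1]) / dz      # first-order upwind
            exchange = k_exchange * (T_wall - T[i])      # lumped wall heating
            Tn[i] = T[i] + dt * (advection + exchange)
        T = Tn
    return T

if __name__ == "__main__":
    profile = simulate()
    print(f"outlet temperature after 20 s: {profile[-1]:.2f} deg C")
```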
Category: General Science and Philosophy

[157] viXra:1208.0173 [pdf] submitted on 2012-08-19 09:07:43

Production of Biodiesel from Fluted Pumpkin (Telfairia Occidentalis Hook F.) seeds Oil

Authors: E.I. Bello, S.A. Anjorin, M. Agge
Comments: 10 Pages.

In this study, the work done on the extraction of oil from fluted pumpkin (Telfairia occidentalis Hook F.) seeds, its transesterification to methyl ester (biodiesel) and its characterization is reported. The oil was extracted in a Soxhlet extractor using normal hexane as solvent. The oil properties were measured; the free fatty acid value was 3.59 mg KOH/g, which is high for alkaline transesterification, hence the oil was neutralized with hydrochloric acid before transesterification, which was carried out using 3 g of sodium hydroxide per litre of methanol as catalyst and methoxide/oil in a volume ratio of 6:1. Gas chromatography analysis shows that the oil and its methyl ester contain primarily the fatty acids oleic (C18:1) and linoleic (C18:2). The fuel properties were evaluated following the American Society for Testing and Materials (ASTM) methods for biodiesel. The fuel properties are very close to those of diesel fuel, hence it can be used as an alternative fuel for diesel engines. Of particular importance are the high flash point, which makes it a safe fuel, and the low pour point, which allows it to be used in cold climates.
Category: General Science and Philosophy

[156] viXra:1208.0172 [pdf] submitted on 2012-08-19 07:14:48

Microstructural and Tribological Characterization of TiN Coated Aluminium Alloy (Al6061)

Authors: Anil Kumar H C, N.K. Udayashankar, Sudheendra P, H.S. Hebbar
Comments: 6 Pages.

Titanium nitride (TiN) was deposited on aluminium alloy Al6061 using the reactive DC magnetron sputtering technique. X-ray diffraction and EDAX confirmed the presence of the TiN phase in the coating. Optical microscopy showed the golden-bronze coloured TiN coating at a N2/Ar ratio of 0.47. A maximum composite microhardness of 2210 HK was obtained at a 5 g load for the coating deposited at a N2/Ar ratio of 0.60. The dry sliding wear behaviour was studied on a pin-on-disc machine. Oxidation wear prevailed during the sliding test. Coatings deposited at N2/Ar ratios of 0.47 and 0.60 showed better wear resistance compared to the uncoated specimens.
Category: General Science and Philosophy

[155] viXra:1208.0171 [pdf] submitted on 2012-08-19 07:17:59

Application of the Hazard Rate Measure in Studying Change of Temperature in a Pulsating Heat Pipe

Authors: Pranab K. Barua
Comments: 6 Pages.

In this article, we have discussed the importance of applying the hazard rate measure in studying matters related to change of temperature in pulsating heat pipes. It has been found that the hazard rate decreases with increase in diameter of the heat pipe. Finally, it has been validated statistically that the hazard rate increases exponentially as the number of turns in the evaporator section of the heat pipe increases.
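For reference, the hazard rate is h(t) = f(t)/(1 - F(t)); the sketch below checks this definition on a constant-hazard (exponential) lifetime and evaluates an assumed exponentially increasing hazard of the kind reported for growing numbers of evaporator turns. The coefficients are illustrative, not fitted to the article's data.

```python
# Sketch of the hazard rate idea: h(t) = f(t) / (1 - F(t)), i.e. the
# instantaneous event rate given survival so far. The exponentially increasing
# form h = a * exp(b * n) mirrors the reported growth with the number of
# evaporator turns; a and b are assumed values, not fitted coefficients.
import math

def hazard_from_distribution(pdf, cdf, t):
    """Generic hazard rate h(t) = f(t) / (1 - F(t))."""
    return pdf(t) / (1.0 - cdf(t))

def exponential_hazard(n_turns, a=0.05, b=0.4):
    """Assumed Gompertz-type hazard growing exponentially with turns."""
    return a * math.exp(b * n_turns)

if __name__ == "__main__":
    # sanity check on an exponential lifetime (constant hazard lam = 0.2)
    lam = 0.2
    h = hazard_from_distribution(lambda t: lam * math.exp(-lam * t),
                                 lambda t: 1.0 - math.exp(-lam * t), t=3.0)
    print(f"constant hazard check: {h:.3f}")
    for n in (2, 4, 6, 8):
        print(f"turns = {n}: h = {exponential_hazard(n):.3f}")
```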
Category: General Science and Philosophy

[154] viXra:1208.0170 [pdf] submitted on 2012-08-19 07:18:43

Anti-Swing Control Strategy for Automatic 3 D.O.F Crane System Using FLC

Authors: Jamaludin Jalani
Comments: 9 Pages.

This paper presents a control strategy for positioning control and anti-swing control of a 3 Degree-of-Freedom (D.O.F) crane system. The 3 D.O.F crane system is a type of machine, generally equipped with a hoist, wire ropes or chains, and sheaves, that can be used to lift and lower materials and to move them horizontally. However, controlling 3 D.O.F crane systems requires a good control method to achieve accurate positioning and, in particular, to suppress the swing produced during operation. Hence, choosing an appropriate controller for positioning and swing angle is not a trivial task, in particular when payloads must be transferred quickly, effectively and safely. Presently, existing 3 D.O.F systems use a conventional PID controller to control position and swing angle; such controllers are designed based on the model and parameters of the crane system. In general, modelling and parameter identification are troublesome and time consuming. Therefore, we propose a Fuzzy Logic Control (FLC), which has a simpler and more practical design approach. Effectively, it is anticipated that the FLC can be used to avoid complex mathematical calculations, which are always time consuming; in addition, the model derivation is often inaccurate due to the presence of nonlinearities and uncertainties. Throughout this paper, the FLC performance is compared with a PID controller through experiments. The results show that the FLC produces good results for positioning and anti-swing control of the 3 D.O.F crane system.
Category: General Science and Philosophy

[153] viXra:1208.0169 [pdf] submitted on 2012-08-19 07:19:35

Optimization of 16” Plug Valve Body Using FEA And Experimental Stress Analysis Method

Authors: Deokar Vinayak Hindurao, D.S.Chavan
Comments: 5 Pages.

In a competitive environment, companies must supply their goods and services with high quality, in the shortest period and at lower prices than their competitors, in order to retain their capacity and power to compete. Plug valves are machine elements which are commonly used for regulation of fluid, semi-liquid and granular medium flow in a variety of tanks and pipeline systems. This paper discusses FEA analysis of a plug valve body followed by experimental stress analysis using the strain gauge method for weight optimization. New optimized models were prepared on the basis of validation of the results obtained from the stress analysis procedure. The weight reduction is achieved by changing the wall and rib thickness. The results clearly show a maximum weight reduction of 24.86 kg (5.26% of the original weight) while keeping the maximum stress level up to 168.6 N/mm², which is safe for the applied load.
Category: General Science and Philosophy

[152] viXra:1208.0168 [pdf] submitted on 2012-08-19 07:20:47

Investigation of Thermal Performance of Electric Vehicle BLDC Motor

Authors: James Kuria, Pyung Hwang
Comments: 17 Pages.

Overreliance on petroleum products and environmental pollution from combustion emissions produced by automobiles have led to extensive research on hybrid electric vehicles, electric vehicles and their components. A key component in these vehicles is the electric motor, used for traction as well as for powering other appliances like the compressor. Overheating in electric motors results in detrimental effects such as degradation of the insulation materials, magnet demagnetization, increase in Joule losses, and decreased motor efficiency and lifetime. Hence, it is important to find ways of optimizing the performance and reliability of electric motors through effective cooling and consequently reduce operating and maintenance costs. This study describes 3D CFD simulations performed on a totally enclosed, air-over, fan-cooled brushless D.C. motor to identify the temperatures of the critical components of the motor and the effect of varying thermal parameters on these temperatures. The energy sources are obtained from electromagnetic losses computed using MAXWELL, a commercial FEA software, and bearing losses obtained through numerical methods developed by the authors. A finned casing is used as the heat sink, and the effect of varying the fin geometry on the cooling performance is examined using three heat sink designs. The results show that the highest temperature occurs at the end windings and that this temperature can be reduced by up to 15% by introducing a suitable finned housing. These results show that CFD can be used effectively to optimize the cooling performance of electric motors. Experimental tests are underway to validate the CFD results.
Category: General Science and Philosophy

[151] viXra:1208.0167 [pdf] submitted on 2012-08-19 07:21:37

Effect of Post-Process Artificial Aging Treatment on Tensile Properties of SiC Particle Reinforced AA6061-T6 Surface Metal Matrix Composite Fabricated Via Friction Stir Process

Authors: Devaraju Aruri, Adepu Kumar, B Kotiveerachary
Comments: 11 Pages.

This paper reports on studies of the influence of post-process artificial aging (PPAA) treatment on the tensile properties of a SiC particle reinforced AA6061-T6 surface metal matrix composite produced via friction stir processing (FSP). In the FSPed composite, the SiC particles were uniformly distributed in the stir zone without any defects, and the composite exhibited higher microhardness than the as-received Al alloy. The FSPed composite exhibited lower tensile properties compared to the as-received Al; after the application of the post-process artificial aging treatment at 170 °C for a soaking period of 16 h, the tensile properties increased by around 50% relative to the untreated material.
Category: General Science and Philosophy

[150] viXra:1208.0166 [pdf] submitted on 2012-08-19 07:22:34

Prediction on Reduction of Emission of NOx in Diesel Engine Using Bio-Diesel Fuel and EGR (Exhaust Gas Recirculation) System

Authors: Pooja Ghodasara, M.S. Rathore
Comments: 8 Pages.

Environmental degradation and depleting oil reserves are matters of concern around the globe. The search for energy independence and concern for a cleaner environment have generated significant interest in biodiesel. It has been shown that biodiesel-fuelled engines produce less carbon monoxide, unburnt hydrocarbon and smoke emissions compared to diesel fuel, but higher NOx emissions. EGR is an effective technique to reduce NOx from diesel engines, as it lowers the flame temperature and reduces the oxygen concentration in the combustion chamber. The objective of this research is to investigate the use of biodiesel and EGR simultaneously in order to reduce the emissions of all regulated pollutants from a diesel engine. For this, a single-cylinder, air-cooled, constant-speed, direct-injection diesel engine was used, and an EGR system was developed and fitted to the engine. Various emissions such as HC, NOx, CO and smoke opacity were measured, and the engine performance parameters were calculated from the measured data.
Category: General Science and Philosophy

[149] viXra:1208.0165 [pdf] replaced on 2013-01-05 04:51:02

Electron Spin Precession for the Time Fractional Pauli Equation

Authors: Hosein Nasrolahpour
Comments: 7 Pages.

In this work, we aim to extend the application of fractional calculus in the realm of quantum mechanics. We present a time fractional Pauli equation containing the Caputo fractional derivative. Using the new equation, we study the electron spin precession problem in a homogeneous constant magnetic field.
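For reference, the Caputo fractional derivative named above is conventionally defined as follows for order 0 < α < 1; the paper's precise notation and normalization may differ.

```latex
% Conventional Caputo fractional time derivative of order 0 < \alpha < 1;
% the paper's notation and normalization may differ.
{}^{C}D_{t}^{\alpha} f(t) \;=\; \frac{1}{\Gamma(1-\alpha)}
\int_{0}^{t} \frac{f'(\tau)}{(t-\tau)^{\alpha}}\, d\tau ,
\qquad 0 < \alpha < 1 .
```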
Category: Quantum Physics

[148] viXra:1208.0164 [pdf] submitted on 2012-08-19 05:05:45

Quantum Model for the Direct Currents of Becker

Authors: Matti Pitkanen
Comments: 25 Pages.

Robert Becker proposed on the basis of his experimental work that living matter behaves as a semiconductor in a wide range of length scales ranging from the brain scale to the scale of the entire body. Direct currents flowing only in a preferred direction would be essential for the functioning of living matter in this framework. One of the basic ideas of the TGD inspired theory of living matter is that various currents, even ionic currents, are quantal currents. The first possibility is that they are Josephson currents associated with Josephson junctions, but already this assumption more or less implies also quantal versions of direct currents. The TGD inspired model for nerve pulse assumed that ionic currents through the cell membrane are probably Josephson currents. If this is the case, the situation is automatically stationary and dissipation is small, as various anomalies suggest. One can criticize this assumption since the Compton length of ions for the ordinary value of Planck constant is so small that magnetic flux tubes carrying the current through the membrane look rather long in this length scale. Therefore either the Planck constant should be rather large or one should have a non-ohmic quantum counterpart of a direct current in the case of ions, and perhaps also protons, in the case of the neuronal membrane: electronic and perhaps also protonic currents could still be Josephson currents. This would conform with the low dissipation rate. In the following, the results related to laser induced healing, acupuncture, and DC currents are discussed first. The obvious question is whether these direct currents are actually currents and whether they could be universal in living matter. A TGD inspired model for quantal direct currents is proposed and its possible implications for the model of nerve pulse are discussed.
Category: Physics of Biology

[147] viXra:1208.0163 [pdf] submitted on 2012-08-19 05:02:18

How to Build a Quantum Computer from Magnetic Flux Tubes?

Authors: Matti Pitkanen
Comments: 6 Pages.

Magnetic flux tubes play a key role in the TGD inspired model of quantum biology. Could the networks of magnetic flux tubes containing dark particles with large \hbar in macroscopic quantum states and carrying beams of dark photons define analogs of electric circuits? This would be rather cheap technology, since no metal would be needed for wires. Dark photon beams would propagate along the flux tubes representing the analogs of optical cables and make possible communications with maximal signal velocity. I have actually made a much more radical proposal in TGD inspired quantum biology. According to this proposal, flux tube connections are dynamical and can be changed by reconnection of two magnetic flux tubes. The signal pathways A→ C and B→ D would be transformed to signal pathways A→ D and B→ C by reconnection. Reconnection actually represents a basic stringy vertex. The contraction of magnetic flux tubes by a phase transition changing Planck constant could be fundamental in bio-catalysis, since it would allow distant molecules connected by flux tubes to find each other in the molecular crowd. DNA as a topological quantum computer is an idea that I have been developing for 5 years or so. I have concentrated on the new physics realization of braids and have not devoted much thought to how quantum computer problems might run in this framework. I was surprised to realize how little I know about what happens in even ordinary computation. Instead of going immediately to Wikipedia, I take the risk of publicly making a fool of myself and try to use my own brain.
Category: Physics of Biology

[146] viXra:1208.0162 [pdf] submitted on 2012-08-19 04:57:58

Does Thermodynamics Have a Representation at the Level of Space-Time Geometry?

Authors: Matti Pitkanen
Comments: 9 Pages.

R. Kiehn has proposed what he calls Topological Thermodynamics (TTD) as a new formalism of thermodynamics. The basic vision is that thermodynamical equations could be translated to differential geometric statements using the notions of differential forms and Pfaffian systems. That TTD differs from TGD by a single letter is not enough reason to ask whether some relationship between them might exist. Quantum TGD can however in a well-defined sense be regarded as a square root of thermodynamics in zero energy ontology (ZEO), and this leads one to ask seriously whether TTD might help to understand TGD at a deeper level. The thermodynamical interpretation of space-time dynamics would obviously generalize black hole thermodynamics to the TGD framework, and already earlier some concrete proposals have been made in this direction. This raises several questions. Could the preferred extremals of Kähler action code for the square root of thermodynamics? Could the induced Kähler gauge potential and Kähler form (essentially a Maxwell field) have a formal thermodynamic interpretation? The vacuum degeneracy of Kähler action implies 4-D spin glass degeneracy and strongly suggests the failure of strict determinism for the dynamics of Kähler action for non-vacuum extremals too. Could thermodynamical irreversibility and a preferred arrow of time allow one to characterize the notion of preferred extremal more sharply? It indeed turns out that one can translate Kiehn's notions to the TGD framework rather straightforwardly. Kiehn's work 1-form corresponds to the induced Kähler gauge potential, implying that the vanishing of the instanton density for the Kähler form becomes a criterion of reversibility, and irreversibility is localized on the (4-D) "lines" of generalized Feynman diagrams, which correspond to space-like signature of the induced metric. The heat produced in a given generalized Feynman diagram is just the integral of the instanton density, and the condition that the arrow of geometric time has a definite sign classically fixes the sign of the produced heat to be positive. In this picture the preferred extremals of Kähler action would allow a trinity of interpretations as non-linear Maxwellian dynamics, thermodynamics, and integrable hydrodynamics.
Category: Quantum Gravity and String Theory

[145] viXra:1208.0161 [pdf] submitted on 2012-08-19 02:23:48

Kernel based Object Tracking using Colour Histogram Technique

Authors: Prajna Parimita Dash, Dipti patra, Sudhansu Kumar Mishra, Jagannath Sethi
Comments: 8 Pages.

Object tracking is the process of locating moving objects in consecutive video frames. Real-time object tracking is a challenging problem in the fields of computer vision, motion-based recognition, automated surveillance, traffic monitoring, augmented reality, object-based video compression, etc. In this paper, kernel-based object tracking using the color histogram technique has been applied to different challenging situations. The covariance tracking algorithm has also been applied to the same problem. The simulation studies make it clear that these two techniques effectively handle various challenges like occlusion, illumination changes, etc. Experimental results reveal that the histogram-based method is efficient in terms of computation time, while the covariance tracker is better in terms of detection rate.
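A minimal sketch of the similarity measure that underlies histogram-based kernel tracking: normalized colour histograms compared with the Bhattacharyya coefficient. Random patches stand in for video frames, and the bin count and patch sizes are assumed; this is not the paper's implementation.

```python
# Colour-histogram similarity sketch: build normalized joint RGB histograms
# for a target and a candidate region and compare them with the Bhattacharyya
# coefficient, the standard similarity used in kernel-based tracking.
import numpy as np

def colour_histogram(patch, bins=8):
    """Joint RGB histogram of an HxWx3 uint8 patch, L1-normalized."""
    hist, _ = np.histogramdd(patch.reshape(-1, 3),
                             bins=(bins, bins, bins),
                             range=((0, 256), (0, 256), (0, 256)))
    return hist.ravel() / hist.sum()

def bhattacharyya(p, q):
    """Similarity in [0, 1]; 1 means identical distributions."""
    return float(np.sum(np.sqrt(p * q)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.integers(0, 256, size=(32, 32, 3), dtype=np.uint8)
    noise = rng.integers(-10, 10, size=target.shape)           # mild perturbation
    candidate = np.clip(target.astype(int) + noise, 0, 255).astype(np.uint8)
    rho = bhattacharyya(colour_histogram(target), colour_histogram(candidate))
    print(f"Bhattacharyya coefficient: {rho:.3f}")
```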
Category: Digital Signal Processing

[144] viXra:1208.0160 [pdf] submitted on 2012-08-19 02:24:41

Techniques for Improving Performance in Managed Overlays Network

Authors: Mohanjeet Singh, D.S Dhaliwal, Neeraj Garg, Gurdas Singh
Comments: 7 Pages.

During the last years, overlay networks have become one of the most prominent tools for Internet research and development. With overlay networks, networking users, developers and application users can easily design and implement their own communication environment and protocols on top of the Internet. An overlay is a network on top of an existing network that provides additional services: a virtual network of nodes and logical links built on top of the existing network, in which the overlay defines the addressing, routing and service model for communication between hosts. Applications in which overlay networks are used include distributed systems such as cloud computing, peer-to-peer networks, and client-server applications; they are also used in designing one's own environment, for example for data routing and file-sharing management. In this paper we discuss various parameters which affect the performance of managed overlays.
Category: Digital Signal Processing

[143] viXra:1208.0159 [pdf] submitted on 2012-08-19 02:25:55

Distribution Transformer Random Transient Suppression using Diode Bridge T-type LC Reactor

Authors: Leong Bee Keoh, Mohd Wazir Mustafa, Sazali P. Abdul Karim
Comments: 8 Pages.

A new application of a diode bridge T-type LC reactor as a transient suppressor for distribution transformers is presented. The proposed transient suppressor is effective in reducing the peak overvoltage and voltage steepness by a factor of two. The transient suppressor is connected upstream of the protected transformer to mitigate transients induced by lightning. The approach developed enables one to construct a simple and low-cost transient suppressor with negligible effects on the system's steady-state operation. The proposed transient suppressor is numerically tested using the simulation package PSCAD.
Category: Digital Signal Processing

[142] viXra:1208.0158 [pdf] submitted on 2012-08-19 02:27:08

Segmentation and Analysis of Microscopic Osteosarcoma Bone Images

Authors: Anand Jatti, S.C.Prasannakumar, Ramakanth Kumar
Comments: 10 Pages.

The characteristics of microscopic osteosarcoma bone cross-section images carry essential clues for defining the important features in the bone cross-section, such as stroma, bone, malignant cells, myxoid matrix and artifacts, for different age groups and also for age-related developments and diseases. The traditional approaches to bone microscopic image analysis rely primarily on manual processes with a very limited number of bone samples, which makes it very difficult to reach reliable and consistent conclusions. A new hybrid technique of image segmentation which uses microscopic images for processing is proposed. This hybrid segmentation technique automates the bone image analysis process and is able to produce reliable results based on qualitative measurements of the features extracted from the microscopic bone images. The study of the correlation of bone structural features with age-related developments and diseases becomes feasible from large databases of bone images.
Category: Digital Signal Processing

[141] viXra:1208.0156 [pdf] replaced on 2012-11-13 01:48:49

Is it Really Higgs?

Authors: Matti Pitkanen
Comments: 43 Pages.

The discovery of a new spinless particle at LHC has dominated the discussions in physics blogs during July 2012. Quite many bloggers identify without hesitation the new particle as the long sought-for Higgs, although some aspects of the data do not encourage the interpretation as the standard model Higgs or possibly its SUSY variant. Maybe the reason is that it is rather difficult to imagine any other interpretation. In this article the TGD based interpretation as a pion-like state of a scaled-up variant of hadron physics is discussed, explaining also why the standard model Higgs - by definition the provider of fermion masses - is not needed. Essentially one assumption, the separate conservation of quark and lepton numbers realized in terms of 8-D chiral invariance, excludes Higgs-like states in this sense, as well as standard N=1 SUSY. One can however consider Higgs-like particles giving masses to weak gauge bosons: the motivation comes from the correctly predicted group-theoretical W/Z mass ratio. The pion of M89 hadron physics is the TGD proposal for a state behaving like Higgs, and its decays via instanton coupling mimic the decays of Higgs to gauge boson pairs. For this option also charged Higgs-like states are predicted. The instanton coupling can however generate a vacuum expectation value for the pion, and this indeed happens in the model for the leptopion. This would lead to the counterpart of the Higgs mechanism, with weak bosons "eating" three components of the Higgs. This is certainly a problem. The solution is that at the microscopic level the instanton density can be non-vanishing only in Euclidian regions representing lines of generalized Feynman diagrams. It is the Euclidian pion - a flux tube connecting opposite throats of a wormhole contact - which develops a vacuum expectation, whereas the ordinary pion is Minkowskian, corresponds to a flux tube connecting the throats of separate wormhole contacts, and cannot develop a vacuum expectation. This identification could explain the failure to find the decays to τ pairs and also the excess of two-gamma decays. The decays to gauge boson pairs would be caused by the coupling of the pion-like state to the instanton density for electro-weak gauge fields. Also a connection with the dark matter searches reporting a signal at 130 GeV, and possibly also at 110 GeV, suggests itself: maybe these signals also correspond to pion-like states.
Category: High Energy Particle Physics

[140] viXra:1208.0154 [pdf] submitted on 2012-08-18 12:28:32

On Demand Quality of Web Services Using Ranking by Multi Criteria

Authors: N. Rajanikanth, P.Pradeep Kumar, B. Meena
Comments: 7 Pages.

In the Web database scenario, the records to match are highly query-dependent, since they can only be obtained through online queries. Moreover, they are only a partial and biased portion of all the data in the source Web databases. Consequently, hand-coding or offline-learning approaches are not appropriate for two reasons. First, the full data set is not available beforehand, and therefore, good representative data for training are hard to obtain. Second, and most importantly, even if good representative data are found and labeled for learning, the rules learned on the representatives of a full data set may not work well on a partial and biased part of that data set.
Category: Digital Signal Processing

[139] viXra:1208.0153 [pdf] submitted on 2012-08-18 12:51:08

Failure Modes in Embedded Systems and Its Prevention

Authors: Samitha Khaiyum, Y S Kumaraswamy
Comments: 5 Pages.

Systems failures do not occur in a vacuum; while a single event may trigger the failure, investigation often reveals that a history of managerial and technical decisions produces conditions that turn a single event into a disaster. At a minimum, investigating case studies provides lessons on what to avoid. By systematic study of failures, it may be possible to draw general conclusions and improve practice as a whole. Unfortunately, good systems failure studies are rare. Embedded systems failure is a volatile topic, and the field is filled with a vast amount of noise, urban myth, and political agendas.
Category: Digital Signal Processing

[138] viXra:1208.0152 [pdf] submitted on 2012-08-18 12:55:38

Lobar Fissure Extraction in Isotropic CT Lung Images - An Application to Cancer Identification

Authors: T.Manikandan, N. Bharathi
Comments: 5 Pages.

The lungs are the essential organs for respiration in human beings. They consist of five distinct lobes which are separated by three fissures (the boundaries of lung lobes are the areas containing fissures and lacking bronchial trees): two oblique (left and right) fissures and one horizontal fissure. The left lung contains the left oblique fissure, which separates its superior and inferior lobes. The right lung contains the right oblique fissure, which separates the inferior lobe from the middle and superior lobes, and the right horizontal fissure, which separates the superior and middle lobes. The identification of the lobar fissures in isotropic Computed Tomography (CT) images is very difficult even for experienced surgeons because of their variable shape and appearance along with the low contrast and high noise associated with them. Further, the fissure thickness, observed to be around 2 pixels (approximately 1.2 mm), complicates fissure identification. The identification of lobar fissures in CT images helps the surgeon to identify the cancer location before planning surgery. The surgical removal of the diseased lung is the final stage of treating lung cancer; therefore it is necessary to find the cancer location at an early stage to treat it. This paper presents an automated method to extract the left and right oblique fissures from CT lung images. The proposed method is implemented in two phases. In the first phase, the fissure region is located. In the second phase, the found lobar fissures are extracted. The obtained results show that the proposed work can help the surgeon to identify the cancer location.
Category: Digital Signal Processing

[137] viXra:1208.0150 [pdf] submitted on 2012-08-18 12:59:29

A New Approach to Spam Mail Detection

Authors: R. Jensi
Comments: 4 Pages.

The ever-increasing menace of spam is bringing down productivity. More than 70% of email messages are spam, and it has become a challenge to separate such messages from legitimate ones. I have developed a spam identification engine which employs a naive Bayesian classifier to identify spam. A new concept-based mining model that analyzes terms at the sentence and document levels is introduced. The concept-based mining model can effectively discriminate between non-important terms with respect to sentence semantics and terms which hold the concepts that represent the sentence meaning. The proposed mining model consists of sentence-based concept analysis, document-based concept analysis and a similarity measure. In this paper, a machine learning approach based on Bayesian analysis to filter spam is described. The filter learns what spam and non-spam messages look like, and is capable of making a binary classification decision (spam or non-spam) whenever a new email message is presented to it. The evaluation of the filter showed its ability to make decisions with high accuracy. Cost sensitivity was incorporated into the spam engine, and high precision and recall were achieved, thereby reducing the false positive rate.
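A toy sketch of the naive Bayesian filtering step described above, using a bag-of-words representation and a multinomial naive Bayes classifier; the tiny message set is invented for illustration, and the concept-based mining model of the paper is not reproduced.

```python
# Toy naive Bayesian spam filter: bag-of-words features plus a multinomial
# naive Bayes classifier. The messages below are made-up examples, not the
# corpus used in the paper.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

train_msgs = [
    "win a free prize now", "cheap meds limited offer",
    "meeting agenda for monday", "please review the attached report",
]
train_labels = ["spam", "spam", "ham", "ham"]

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(train_msgs)          # sparse term-count matrix
clf = MultinomialNB().fit(X, train_labels)

test = ["free prize offer", "monday report review"]
print(list(zip(test, clf.predict(vectorizer.transform(test)))))
```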
Category: Digital Signal Processing

[136] viXra:1208.0149 [pdf] submitted on 2012-08-18 13:00:37

Access Control for Healthcare Data Using Extended XACML-SRBAC Model

Authors: A. A. Abd EL-Aziz, A.kannan
Comments: 4 Pages.

In the modern health service, data are accessed by doctors and nurses using mobile phones, Personal Digital Assistants and other electronic handheld devices. An individual's health-related information is normally stored in a central health repository and can be accessed only by authorized doctors. However, this data is exposed to a number of mobile attacks while being accessed. This paper proposes a framework using XACML and XML security to support a secure, embedded and fine-grained access control policy to control the privacy and access of health service data accessed through handheld devices. We also consider one of the models, namely Spatial Role-Based Access Control (SRBAC), and model it using XACML.
Category: Digital Signal Processing

[135] viXra:1208.0148 [pdf] submitted on 2012-08-18 13:02:50

New Algorithm “DRREQ” Applied in AODV Route Discovery Mechanism for Energy Optimization in Mobile Ad hoc Networks

Authors: A.A.Boudhir, M. Bouhorma, M. Ben Ahmed, Elbrak Said
Comments: 6 Pages.

In Mobile Ad hoc Networks, the route discovery mechanism utilizes a simple flooding method, where a mobile node massively rebroadcasts received route request (RREQ) packets until a route to the destination is established. This can cause many retransmissions and thus excessive energy consumption. This work aims to reduce the number of RREQ (Route Request) messages by means of a new probabilistic and dichotomic method for the discovery of the destination. This method can significantly reduce the number of RREQ packets transmitted during the route discovery operation, and our simulation analysis shows significant performance improvements in terms of energy. We propose a new probabilistic algorithm, DRREQ (Dichotomic Route Request), which significantly reduces energy consumption. Extensive simulations have been carried out to evaluate the performance of our algorithm in different scenarios.
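As a point of comparison, the sketch below shows a plain probabilistic (gossip) rebroadcast decision, the kind of flooding reduction such work builds on; it is not the authors' dichotomic DRREQ algorithm, and the base probability and density scaling are assumed values.

```python
# Generic probabilistic RREQ-rebroadcast decision: each node forwards a fresh
# route request with probability p instead of always flooding, which cuts
# redundant transmissions and energy use. This is a plain gossip sketch, not
# the authors' dichotomic DRREQ algorithm.
import random

def should_rebroadcast(seen_before, neighbour_count, p_base=0.6):
    """Drop duplicates; forward fresh RREQs with a probability that shrinks
    as the neighbourhood gets denser (denser areas need fewer relays)."""
    if seen_before:
        return False
    p = p_base / max(1, neighbour_count / 4)   # assumed density scaling
    return random.random() < min(1.0, p)

if __name__ == "__main__":
    random.seed(1)
    forwards = sum(should_rebroadcast(False, neighbour_count=8) for _ in range(1000))
    print(f"forwarded {forwards} of 1000 fresh RREQs (~{forwards / 10:.0f}%)")
```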
Category: Digital Signal Processing

[134] viXra:1208.0147 [pdf] submitted on 2012-08-18 13:03:56

Pixel Level Satellite Image Fusion Using Component Substitution Partial Replacement

Authors: Asha G, Annes Philip
Comments: 10 Pages.

Image fusion is capable of integrating different imagery to produce more information than can be derived from a single sensor. Preservation of spectral information and enhancement of spatial resolution are regarded as important issues in remote sensing satellite image fusion. In this paper, a component substitution method with partial replacement is proposed to merge a high-spatial-resolution panchromatic (PAN) image with a multispectral image. This method generates high-/low-resolution synthetic component images by partial replacement and uses statistical ratio-based high-frequency injection. Remote sensing satellite images, such as IKONOS-2, were employed in the evaluation. Experiments showed that this approach can resolve spectral distortion problems and successfully conserve the spatial information of a PAN image. Thus, the fused image obtained from the proposed method has higher fusion quality than the images from some other methods.
Category: Digital Signal Processing

[133] viXra:1208.0146 [pdf] submitted on 2012-08-18 13:05:14

A Novel Merge sort

Authors: D.Abhyankar, M.Ingle
Comments: 6 Pages.

Sorting is one of the most frequently needed computing tasks, and Mergesort is one of the most elegant algorithms for solving the sorting problem. We present a novel sorting algorithm of the Mergesort family that is more efficient than other Mergesort algorithms. Mathematical analysis of the proposed algorithm shows that it reduces data move operations considerably. Profiling was done to verify the impact of the proposed algorithm on the time spent. The profiling results show that the proposed algorithm offers considerable improvement over conventional Mergesort in the case of large records; in the case of small records, the proposed algorithm is also faster than classical Mergesort. Moreover, the proposed algorithm is found to be more adaptive than classical Mergesort.
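For orientation, the sketch below is the classical top-down Mergesort that serves as the baseline in such comparisons; the proposed data-move-reducing variant is not reproduced here.

```python
# Classical top-down Mergesort, the baseline against which a data-move-
# reducing variant would be measured.

def merge_sort(items):
    """Return a new sorted list; stable, O(n log n) comparisons."""
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left, right = merge_sort(items[:mid]), merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

if __name__ == "__main__":
    print(merge_sort([5, 2, 9, 1, 5, 6]))   # -> [1, 2, 5, 5, 6, 9]
```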
Category: Data Structures and Algorithms

[132] viXra:1208.0145 [pdf] submitted on 2012-08-18 14:34:01

F4 and E8: Wrong Assumption: E8 Cannot Unify Fermions and Bosons. Useful Truth: F4 and E8 Lie Algebras Have Both Commutator and Anticommutator Structure.

Authors: Frank Dodd Tony Smith Jr
Comments: 6 Pages.

Realistic physics models must describe both commutator Bosons and anticommutator Fermions so that spin and statistics are consistent. The usual commutator structure of Lie Algebras can only describe Bosons, so a common objection to physics models that describe both Bosons and Fermions in terms of a single unifying Lie Algebra (for example, Garrett Lisi's E8 TOE) is that they violate consistency of spin and statistics by using Lie Algebra commutators to describe Fermions. However, Pierre Ramond has shown in hep-th/0112261 that the exceptional Lie Algebra F4 can be described using anticommutators as well as commutators. This essay uses the periodicity property of Real Clifford Algebras to show that E8 can also be described using anticommutators as well as commutators, so that it may be possible to construct a realistic physics model that uses the exceptional Lie Algebra E8 to describe both Bosons and Fermions. E8 also inherits from F4 Triality-based symmetries between Bosons and Fermions that can give the useful results of SuperSymmetry without requiring conventional SuperPartner particles that are unobserved by the LHC.
Category: High Energy Particle Physics

[131] viXra:1208.0144 [pdf] submitted on 2012-08-18 19:39:31

The Theory of Electrodynamic Space-Time Relativity (Revision 1)

Authors: Yingtao Yang
Comments: 15 Pages.

The theory of electrodynamic space-time relativity is the study of the transformation of time and space between two electrodynamic inertial reference frames which have both an inertial velocity difference and an electric potential difference. It is a fundamental theory of theoretical physics based on Einstein's special theory of relativity and the high-precision experimental facts of the inverse square law of the Coulomb force. It establishes new physical space-time concepts, for example that our space-time is five-dimensional, composed of quaternion space and time. It also proposes some new basic concepts of physics, such as the electric potential limit and the quaternion electric potential, revealing the inherent relationships between electric potential-velocity and time-space. This paper discusses in detail the process of establishing the theory of complex-variable electrodynamic space-time relativity and the theory of quaternion electrodynamic space-time relativity, as well as their various conversions and transformations. This is one of the basic theories among the author's other series of related papers. Keywords: theory of electrodynamic space-time relativity; theory of complex electrodynamic space-time relativity; theory of quaternion electrodynamic space-time relativity; theory of electric potential relativity; special theory of relativity; electric potential limit; complex velocity; complex electric potential; quaternion velocity; quaternion electric potential; quaternion; five-dimensional space-time
Category: Relativity and Cosmology

[130] viXra:1208.0143 [pdf] submitted on 2012-08-18 20:20:52

The Speed of Electromagnetic Radiation

Authors: Glenn A. Baxter
Comments: Five pages

The physical nature of the speed of light is discussed and the so called “Relativity Principle” is defined and discussed.
Category: Relativity and Cosmology

[129] viXra:1208.0141 [pdf] submitted on 2012-08-18 21:31:11

Representing System Processes using the Actor Model / Processor Net

Authors: Anthony Spiteri Staines
Comments: 11 Pages.

This paper describes the issue that modern systems, being composed of multiple components, have certain processing requirements that need to be properly addressed. Unfortunately very few models and notations provide for the idea of process modeling. For this purpose, separate notations have to be used. This work proposes the use of notations that have a specific focus on process modeling concepts. The solution is to use the Actor Model/ Processor Net. The solution is taken a step further by suggesting the use of Processor Network Patterns. These can be useful for describing and categorizing typical behavior in different types of systems.
Category: Digital Signal Processing

[128] viXra:1208.0140 [pdf] submitted on 2012-08-18 21:32:21

An Evolutionary Algorithm for Mining Association Rules Using Boolean Approach

Authors: G.Ravi Kumar, G.A. Ramachandra, G.Sunitha
Comments: 6 Pages.

Frequent pattern mining is one of the active research themes in data mining. It plays an important role in all data mining tasks such as clustering, classification, prediction, and association analysis. Identifying all frequent patterns is the most time-consuming process due to the massive number of patterns generated, and a reasonable solution is to identify an efficient method for finding frequent patterns without candidate generation. In this paper, we present an evolutionary algorithm using a Boolean approach for mining association rules in large databases of sales transactions. Traditional association rule algorithms adopt an iterative method of discovery, which requires very large calculations and a complicated transaction process. Because of this, a new association rule algorithm is proposed in this paper. The new algorithm scans the database only once, avoids generating candidate itemsets when computing frequent itemsets, and adopts a Boolean vector "relational calculus" method to discover frequent itemsets. Experimental results show that this algorithm can quickly discover frequent itemsets and effectively mine potential association rules.
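A minimal sketch of the Boolean-vector idea: transactions as rows of a Boolean matrix, with itemset support computed by bitwise AND in a single pass; the tiny transaction table is invented for illustration, and the evolutionary part of the algorithm is not shown.

```python
# Boolean-vector support counting: each transaction is a Boolean row, and the
# support of an itemset is the fraction of rows where all its columns are True.
import numpy as np

# rows = transactions, columns = items A, B, C, D (made-up example data)
transactions = np.array([
    [1, 1, 0, 1],
    [1, 1, 1, 0],
    [0, 1, 1, 0],
    [1, 1, 0, 0],
], dtype=bool)

def support(item_columns):
    """Fraction of transactions containing every item in item_columns."""
    mask = np.logical_and.reduce(transactions[:, item_columns], axis=1)
    return mask.mean()

if __name__ == "__main__":
    items = "ABCD"
    for cols in ([0], [1], [0, 1], [1, 2]):
        name = "".join(items[c] for c in cols)
        print(f"support({name}) = {support(cols):.2f}")
```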
Category: Digital Signal Processing

[127] viXra:1208.0139 [pdf] submitted on 2012-08-18 21:33:29

Joint Design of Cluster-Based Hierarchical Networking Architecture and Key Management System for Heterogeneous Wireless Sensor Networks

Authors: McKenzie McNeal III, Wei Chen, Sachin Shetty
Comments: 18 Pages.

Current communication protocols used for Wireless Sensor Networks (WSNs) have been designed to be energy efficient, to have low redundancy in sensed data, and to give long network lifetime. One major issue that must be addressed is the security of data communication. Due to the limited capabilities of sensor nodes, designing security-based communication protocols presents a difficult challenge. Since commonly used encryption schemes require high time complexity and large memory, achieving security for WSNs needs unique approaches. In this paper, we consider a heterogeneous wireless sensor network (HWSN) where, while most nodes are resource-limited, a few nodes have more energy, stronger processing capability and longer communication range, and can be used to relax the resource bottleneck. We propose a joint design approach that can best use the benefits that a HWSN brings. We first design a reconfigurable hierarchical networking architecture, where the HWSN is divided by the high-end nodes into regions, the low-end nodes in each region are divided into clusters, and the high-end nodes and cluster heads form a communication/relay backbone. We then design a key management system on top of the hierarchical networking architecture which uses both public and symmetric key cryptography and requires a very small number of security keys. The evaluation and simulation results show that by using the proposed networking architecture and key management scheme, only a small number of keys needs to be preloaded before deployment and stored after key setup to achieve secured communication throughout the entire network.
Category: Digital Signal Processing

[126] viXra:1208.0138 [pdf] submitted on 2012-08-18 21:34:21

Secure Control of Remote Electrical Devices Using Mobile SMS Services

Authors: Kishor T. Mane, G.A. Patil
Comments: 9 Pages.

In this paper an attempt is made to extend the capability of mobile phones for secure control of remote electrical devices using SMS services. The transmission of SMS in a GSM network is not secure: encryption is only provided between the Mobile Station and the Base Station Controller, and the message transmitted between GSM operator networks is not encrypted and hence is not safe. Mobile SMS services have been used for control purposes in various applications, but the control operations are not secured. Therefore, it is desirable to secure SMS by adding a suitable cryptographic algorithm so as to perform operations securely on certain crucial remote devices. In this system the Blowfish algorithm has been enhanced to increase security with respect to parameters like the avalanche effect, GA, key size, and others. The results obtained are far better in the above terms.
Category: Digital Signal Processing

[125] viXra:1208.0137 [pdf] submitted on 2012-08-18 21:35:49

Computer Viruses in UNIX Environment: Case Study

Authors: Asmaa Shaker Ashoor, Sharad Gore, Vilas Kharat
Comments: 11 Pages.

Even people who do not know how to use a computer have heard about viruses through programs about hackers and similar sources. There is no doubt that our culture is fascinated by the potential danger of these viruses. Computer viruses have become a threat to computer users and to almost every field of advanced technology and industry nowadays. Knowledge about viruses is very necessary for anti-virus researchers as well as operating system makers. With the development of open source systems today, computer viruses on these systems should be considered strictly. The goal of this paper is to present my concept of a classification of computer viruses in the UNIX environment. The paper provides some subjective comments on some of the most widely known environments and some methods available to protect UNIX today. We propose some viruses that can work in this environment and suggest some methods to prevent as well as restrain the damage caused by these viruses.
Category: Digital Signal Processing

[124] viXra:1208.0136 [pdf] submitted on 2012-08-18 21:38:30

Modeling UML2 Activity Diagrams by Using Graph Transformation Systems and Abstract State Machines

Authors: Somayeh Azizi, Vahid Panahi
Comments: 15 Pages.

Graphs and diagrams provide a simple and powerful approach to a variety of problems that are typical of computer science, for example activities. In software development, visual notations are used for modeling, including activity diagrams, class diagrams, control flow graphs and other diagrams. Models based on these notations can be seen as graphs, and graph transformations are involved. The Abstract State Machine (ASM) is a modern computation model. ASM-based tools are used in academia and industry, albeit on a modest scale. They allow one to give high-level operational semantics to computer artifacts and to write executable specifications of software and hardware at the desired abstraction level. The token flow semantics of UML2 activity diagrams is formally defined using Abstract State Machines and a Graph Transformation System. The state of the art in semantics for UML2 activity diagrams covers three distinct approaches: mapping to Petri nets, using graph transformation rules, or providing pseudo-code. ASM uses pseudo-code and the graph transformation system uses graph transformation rules for determining the semantics. A major goal of this paper is the ability to determine the behavioral correctness and formal semantics of UML2 activity diagrams by means of a Graph Transformation System and Abstract State Machine.
Category: Digital Signal Processing

[123] viXra:1208.0135 [pdf] submitted on 2012-08-18 21:39:40

A Brief Study on Human Bone Anatomy and Bone Fractures

Authors: N.Umadevi, S.N.Geethalakshmi
Comments: 12 Pages.

The rapid and continuing progress in computerized medical image reconstruction, and the associated developments in analysis methods and computer-aided diagnosis, have propelled medical imaging into one of the most important sub-fields of scientific imaging. One important area is fracture detection from X-ray images, where automated tools are used to detect fractures. In order to develop an automated fracture detection system, a clear understanding of the human skeletal system, bone and fractures is required. This paper provides a description of each of these topics.
Category: Digital Signal Processing

[122] viXra:1208.0134 [pdf] submitted on 2012-08-18 21:40:38

Towards Optimizing Learning Paths in an Adaptive e-Learning System: Application of the Ant Colony Algorithm (ACO)

Authors: Samir ALLACH, Mohammad ESSAAIDI, Mohamed BEN AHMED
Comments: 11 Pages.

Adaptive e-learning refers to a training concept in which technology is introduced step by step into all aspects of the training business. A technique inspired by Ant Colony Optimization (ACO) is proposed to optimize the learning path. The proposed platform is modeled by a graph in which nodes represent the educational elements (lessons or exercises) and arcs represent the navigation links between them. Each arc also carries a value that describes its pedagogical importance relative to the neighboring arcs. Students are represented by virtual agents (ants) who use these links.
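A minimal sketch of the core ACO selection rule such an approach relies on: the next node is chosen with probability proportional to pheromone^α × heuristic^β. The toy graph, α, β and the heuristic values are assumed for illustration, not taken from the paper.

```python
# Core ACO edge-choice rule: an ant at a lesson node picks the next node with
# probability proportional to tau^alpha * eta^beta; pheromone is later
# evaporated and reinforced along good paths (not shown here).
import random

def choose_next(pheromone, heuristic, alpha=1.0, beta=2.0):
    """Pick an index with probability ~ tau_i^alpha * eta_i^beta."""
    weights = [(t ** alpha) * (h ** beta) for t, h in zip(pheromone, heuristic)]
    total = sum(weights)
    r, acc = random.uniform(0.0, total), 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

if __name__ == "__main__":
    random.seed(0)
    tau = [1.0, 1.0, 1.0]        # initial pheromone on three candidate lessons
    eta = [0.9, 0.5, 0.2]        # assumed pedagogical desirability
    picks = [choose_next(tau, eta) for _ in range(1000)]
    print([picks.count(i) for i in range(3)])
```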
Category: Digital Signal Processing

[121] viXra:1208.0133 [pdf] submitted on 2012-08-18 21:42:29

Scheduling Algorithm for Beacon Safety Message Dissemination in Vehicular Ad-hoc Networks Increasing QoS

Authors: Mohammad Ali azimi, Mohamad reza Ramezanpor
Comments: 9 Pages.

In this paper we address one research challenge in vehicular ad hoc networks (VANETs): safety message dissemination. We investigate the quality of service for beacon-based safety applications in VANETs and point out that safety applications have some distinctive characteristics which are not addressed in the current literature. We therefore propose a new metric focusing on the significance of coverage in safety scenarios, called effective range, which is based on the satisfaction of very strict quality of service requirements. We also emphasize that beacon-based safety applications may tolerate some message loss. Beacon safety message dissemination in VANETs suffers from poor reliability, especially in congested road traffic; the main origin of this problem is the CSMA nature of Dedicated Short Range Communications (DSRC) at the MAC layer. In this paper, a scheduling algorithm in the application layer is proposed to alleviate the problem. We first divide the road into a number of geographical sections. In each section, we form a cluster of the moving vehicles. We then perform a scheduling algorithm with two levels. In the first level, non-adjacent clusters can transmit at the same time, while the second level of scheduling deals with the inside of each cluster, where we implement a TDMA-like approach. Simulation results show that the proposed algorithm improves the reliability of beacon message dissemination. Moreover, the accuracy of the information which each vehicle can obtain from its neighboring vehicles is significantly increased. Thus the proposed scheduling scheme leads to a considerable enhancement of the safety level provided by Beacon Safety Messages (BSMs).
Category: Digital Signal Processing

[120] viXra:1208.0132 [pdf] submitted on 2012-08-18 21:43:36

Stuttered Speech: An Acoustic Study

Authors: Virupakashipura Krishna Tara, Kempahanumiah Muniswamappa Ravi Kumar
Comments: 8 Pages.

The purpose of this study is to compare the duration characteristics of individual words and of an entire passage in the speech of adults who stutter (S=10), recorded near the onset of their stuttering, with those of control nonstuttering adults (C=10). Stuttered speech was identified in digital recordings of the clients' read speech. The digitized signals were analyzed by means of Cool Edit Pro software. Using visual displays of sound spectrograms, the durations of individual words (including repeated words) in the passage and the duration of the entire passage were analyzed. In this work, 80% of the data were used for training and the remaining 20% for testing overall accuracy.
Category: Digital Signal Processing

[119] viXra:1208.0131 [pdf] submitted on 2012-08-18 21:44:33

The Proposal of a New Image Inpainting Algorithm

Authors: Ouafek Naouel, M. Khiredinne Kholladi
Comments: 7 Pages.

In the domain of image inpainting or retouching, many recent works focus on combining methods from different fields of research in order to obtain more accurate results and more faithful images. In this paper we propose a new algorithm that combines three different methods, each representing a separate field: the first draws on artificial intelligence, the second on partial differential equations (PDEs) and the last on texture synthesis, in order to reconstruct damaged images. A minimal sketch of the PDE-style component follows this entry.
Category: Digital Signal Processing
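The entry above combines AI, PDE and texture-synthesis methods; the sketch below illustrates only a PDE-style diffusion component, using OpenCV's standard inpainting call on a synthetic image and mask. The image, mask and radius are placeholders, and this is not the authors' combined algorithm.

```python
import numpy as np
import cv2

# Build a synthetic test image and a damage mask (placeholders for real data).
img = np.full((128, 128, 3), 200, dtype=np.uint8)
cv2.circle(img, (64, 64), 40, (30, 60, 200), -1)

mask = np.zeros((128, 128), dtype=np.uint8)
cv2.line(mask, (10, 20), (120, 110), 255, 5)        # simulated scratch to repair
damaged = img.copy()
damaged[mask > 0] = 255

# PDE-based diffusion inpainting (Navier-Stokes variant); a 3-pixel radius is an assumed setting.
restored = cv2.inpaint(damaged, mask, inpaintRadius=3, flags=cv2.INPAINT_NS)

err = np.mean(np.abs(restored.astype(int) - img.astype(int)))
print("mean absolute error after inpainting:", float(err))
```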

[118] viXra:1208.0130 [pdf] submitted on 2012-08-18 21:45:40

MS: Multiple Segments with Combinatorial Approach for Mining Frequent Itemsets Over Data Streams

Authors: K Jothimani, S. Antony Selvadoss Thanmani
Comments: 9 Pages.

Mining frequent itemsets in data stream applications is beneficial for a number of purposes such as knowledge discovery, trend learning, fraud detection, transaction prediction and estimation. In data streams, new data arrive continuously as time advances. It is costly, and even impossible, to store all streaming data received so far due to the memory constraint. It is assumed that the stream can only be scanned once; hence, once an item has passed it cannot be revisited unless it is stored in main memory. Storing large parts of the stream, however, is not possible because the amount of data passing by is typically huge. In this paper, we study the problem of finding frequent items in a continuous stream of items. A new frequency measure is introduced, based on a variable window length. We study the properties of the new method and propose an incremental algorithm that can produce the frequent itemsets immediately at any time. In our method, we use multiple segments to handle windows of different sizes. By storing these segments in a data structure, the memory usage can be optimized. Our experiments show that our algorithm performs much better in optimizing memory usage and mines only the most recent patterns in much less time.
Category: Digital Signal Processing

[117] viXra:1208.0129 [pdf] submitted on 2012-08-18 21:51:23

Wireless Sensor Network

Authors: Madhav Bokare, Anagha Ralegaonkar
Comments: 7 Pages.

Sensor networks are expected to play an essential role in the upcoming age of pervasive computing. Due to their constraints in computation, memory and power resources, their susceptibility to physical capture, and their use of wireless communications, security is a challenge in these networks. In this paper we take a glance at wireless technology and take a tour of wireless sensor networks. The paper gives a brief outline of wireless sensor networks and their applications in various fields, describes software and hardware platforms for wireless sensor networks, and discusses possible attacks on WSNs and their countermeasures. Finally, we point out that in designing a sensor network one must build a mechanism that is secure against external attackers.
Category: Digital Signal Processing

[116] viXra:1208.0128 [pdf] submitted on 2012-08-18 21:53:34

Identification of Corneal Aberrations by Using Computer Techniques

Authors: Baki Koyuncu, Pınar Kocabaşoğlu
Comments: 8 Pages.

The objective was to study the relative contributions of the optical aberrations of the cornea and to determine the irregularities across its surface area. Corneal topographic imaging data are used, and corneal aberrations are computed from corneal height maps. Mathematical modeling of the corneal surface is developed using Zernike polynomials, and the models are compared with patients' corneas. Simulation techniques are utilized to determine the amount of correction required with respect to an ideal cornea in a computer environment. A brief sketch of Zernike-based surface fitting follows this entry.
Category: Digital Signal Processing
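The abstract above models the corneal surface with Zernike polynomials. The sketch below shows, under stated assumptions, how low-order Zernike modes can be evaluated on the unit disk and fitted to a synthetic height map by least squares; the synthetic surface and the chosen mode set are illustrative, not the paper's clinical data or method.

```python
import numpy as np
from math import factorial

def zernike_radial(n, m, rho):
    """Radial Zernike polynomial R_n^m(rho) for n >= |m| >= 0 with n - |m| even."""
    m = abs(m)
    out = np.zeros_like(rho)
    for k in range((n - m) // 2 + 1):
        c = ((-1) ** k * factorial(n - k)
             / (factorial(k) * factorial((n + m) // 2 - k) * factorial((n - m) // 2 - k)))
        out += c * rho ** (n - 2 * k)
    return out

def zernike(n, m, rho, theta):
    """Full Zernike mode Z_n^m on the unit disk (cosine for m >= 0, sine for m < 0)."""
    r = zernike_radial(n, m, rho)
    return r * (np.cos(m * theta) if m >= 0 else np.sin(-m * theta))

# Synthetic "corneal height map" on the unit disk (an assumption, standing in for topography data).
y, x = np.mgrid[-1:1:128j, -1:1:128j]
rho, theta = np.hypot(x, y), np.arctan2(y, x)
inside = rho <= 1.0
height = 0.5 * (2 * rho**2 - 1) + 0.2 * rho**2 * np.cos(2 * theta)   # defocus + astigmatism

# Least-squares fit of a few low-order modes; coefficients should recover the ground truth above.
modes = [(2, 0), (2, 2), (2, -2), (3, 1)]
A = np.column_stack([zernike(n, m, rho[inside], theta[inside]) for n, m in modes])
coeffs, *_ = np.linalg.lstsq(A, height[inside], rcond=None)
for (n, m), c in zip(modes, coeffs):
    print(f"Z({n},{m}) coefficient = {c:.3f}")
```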

[115] viXra:1208.0127 [pdf] submitted on 2012-08-18 21:54:50

Additive Features in Mobile Short Messaging Services with Specific Reference to Reminders and Formatting

Authors: Niky K. Jain, Kamini H. Solanki
Comments: 8 Pages.

The exponential growth of Short Message Service (SMS) use has transformed this service into a widespread tool for social and commercial messaging. It is one of the most heavily used and well-tried mobile services, with global availability within all GSM networks. This paper presents text-formatting features together with a mechanism for submitting a mobile message to the Short Message Service Centre (SMSC) in advance, so that it is sent at a particular date and time. Message text can be formatted with different appearances, as in MS Word, which makes SMS more attractive to use; as input for text formatting, the user selects the text and obtains the desired output by selecting different controls. In addition to formatted text, an SMS can be scheduled with a particular date and time given by the sender. Conditions attached to a message are evaluated, and if they are satisfied, the desired output is sent to the receiver's mobile number; the conditions and the desired responses are generated on the client side. The input data of each record that satisfies a condition are sent automatically as a mobile message, and all information for a particular message is stored in the database, with the input also stored in a file. The proposed mobile scripting language supports DSN-less database connectivity and takes the current path of the input file from the storage card or phone memory. Sending can also be controlled by placing restrictions on the date and time of a mobile message, subject to the SMS operator. The software supports the scripting language and the database: the message body is written and stored as a record through the DSN-less database connection, according to the required column conditions, while the script handles the database connection in the background.
Category: Digital Signal Processing

[114] viXra:1208.0126 [pdf] submitted on 2012-08-18 21:56:02

QoS Routing for Heterogeneous Mobile Ad Hoc Networks

Authors: Mohammed Abdul Waheed, K Karibasappa
Comments: 5 Pages.

Ad hoc networks have seen a drastic increase in their usage scenarios and in the convergence of different applications. Efficient routing is very important for mobile ad hoc networks, yet existing protocols provide little support for QoS and security. In many ad hoc networks multiple types of nodes coexist, and some nodes have larger transmission power, higher transmission data rates, better processing capability and more robustness against bit errors and congestion than other nodes. Hence, a heterogeneous network model is more realistic and provides many advantages. We present a new routing protocol, called QoS routing, which is specifically designed for heterogeneous mobile ad hoc networks. QoS routing utilizes the more powerful nodes as backbone nodes (B-nodes). The routing area is divided into several small, equal-sized cells. One B-node is maintained in each cell, and the routing among B-nodes is very efficient, being based simply on location information and the cell structure. A source discovers a route to the destination in an on-demand way, and most of the routing activities take place among B-nodes. This reduces the number of routing hops and makes the routing more efficient and reliable, since B-nodes have larger bandwidth and transmission range and are more reliable.
Category: Digital Signal Processing

[113] viXra:1208.0125 [pdf] submitted on 2012-08-18 21:57:25

Zernike Moments and Neural Networks for Recognition of Isolated Arabic Characters

Authors: Mustapha Oujaoura, Rachid El Ayachi, Mohamed Fakir, Belaid Bouikhalene, Brahim Minaoui
Comments: 9 Pages.

The aim of this work is to present a system for recognizing isolated Arabic printed characters. This system goes through several stages: preprocessing, feature extraction and classification. Zernike moments, invariant moments and Walsh transformation are used to calculate the features. The classification is based on multilayer neural networks. A recognition rate of 98% is achieved by using Zernike moments.
Category: Digital Signal Processing

[112] viXra:1208.0124 [pdf] submitted on 2012-08-18 21:58:17

Minimizing the Broadcast Routing in the Computer Networks

Authors: Ahmed Younes
Comments: 7 Pages.

In computer networks, it is often necessary for one device to send data to all the other devices. In this paper, broadcast routing algorithms that aim to minimize the cost of the routing path are proposed. A minimum-cost broadcast routing scheme based on a spanning tree algorithm is presented, and a genetic algorithm is then developed to find the broadcast routing tree of a given network in terms of its links. The algorithm uses the connection matrix of a given network to find the spanning trees and uses the link weights to obtain the minimum spanning tree. An example is provided to illustrate the effectiveness of this algorithm over conventional algorithms; a sketch of the conventional spanning-tree baseline follows this entry.
Category: Digital Signal Processing
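The abstract above builds on a spanning-tree scheme as the conventional baseline for minimum-cost broadcast routing. A minimal Kruskal minimum-spanning-tree sketch over an assumed weighted link list is given below for orientation; the topology is hypothetical and the paper's genetic algorithm itself is not reproduced.

```python
# Kruskal's algorithm on an assumed weighted link list (the conventional baseline,
# not the paper's genetic algorithm).
links = [  # (cost, node_u, node_v) -- illustrative topology
    (4, "A", "B"), (1, "A", "C"), (3, "B", "C"),
    (2, "B", "D"), (5, "C", "D"), (7, "C", "E"), (6, "D", "E"),
]

parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:                 # path halving for union-find
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    ra, rb = find(a), find(b)
    if ra == rb:
        return False
    parent[ra] = rb
    return True

tree, total = [], 0
for cost, u, v in sorted(links):
    if union(u, v):                       # keep the edge only if it joins two components
        tree.append((u, v, cost))
        total += cost

print("broadcast tree:", tree)
print("total cost:", total)
```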

[111] viXra:1208.0123 [pdf] submitted on 2012-08-18 21:59:26

Non-Destructive Quality Analysis of Indian Gujarat-17 Oryza Sativa SSP Indica (Rice) Using Image Processing

Authors: Chetna V. Maheshwari, Kavindra R. Jain, Chintan K. Modi
Comments: 7 Pages.

The agricultural industry as a whole is an ancient one, and quality assessment of grains has been a major challenge since time immemorial. This paper presents a solution for quality evaluation and grading in the rice industry using computer vision and image processing. The basic quality-assessment problem of the rice industry, traditionally handled manually by a human inspector, is defined. Machine vision provides an automated, non-destructive and cost-effective alternative. With the proposed computer vision and image analysis method, a higher degree of quality assessment is achieved compared with human visual inspection. The paper proposes a new method for counting the number of Oryza sativa L. (rice) seeds, both long and small, using image processing, and then quantifies seed quality based on combined measurements; a counting sketch follows this entry.
Category: Digital Signal Processing
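The entry above counts long and small rice seeds from images. The sketch below illustrates one generic way such counting can be done, by labelling connected components in a thresholded binary image and classifying each component by its bounding-box length; the synthetic image and the length threshold are assumptions, not the authors' measurements.

```python
import numpy as np
from scipy import ndimage

# Synthetic binary "scanned grains" image (placeholder for a real thresholded photograph).
img = np.zeros((60, 60), dtype=np.uint8)
img[5:9, 5:20] = 1      # a long seed
img[20:24, 30:38] = 1   # a short seed
img[40:44, 10:24] = 1   # another long seed

# Label connected components and measure each seed's bounding-box length.
labels, n_seeds = ndimage.label(img)
slices = ndimage.find_objects(labels)
LONG_PX = 12            # assumed length threshold separating long from small seeds

long_seeds = sum(1 for s in slices if (s[1].stop - s[1].start) >= LONG_PX)
print(f"total seeds: {n_seeds}, long: {long_seeds}, small: {n_seeds - long_seeds}")
```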

[110] viXra:1208.0122 [pdf] submitted on 2012-08-18 22:00:24

Trip Distribution Model for Delhi Urban Area Using Genetic Algorithm

Authors: Shivendra Goel, J.B. Singh, Ashok K. Sinha
Comments: 8 Pages.

As society evolves it generates transport demand, and an estimate of the volume of trips between zones is a necessary step in transportation studies. The classical transportation planning methods are based on simple extrapolation of trends, while mathematical models such as linear programming have also been used by researchers for estimating traffic generation for future periods. This paper presents a model for trip distribution in the Delhi Urban Area using a genetic algorithm. The model has been applied to trip distribution in all zones of the Delhi Urban Area, using the real data set of passenger trips generated and attracted in each zone, and it gives satisfactory results applicable to current and future scenarios. The work analyzes and compares the results of this model with a linear programming model for trip distribution.
Category: Digital Signal Processing

[109] viXra:1208.0121 [pdf] submitted on 2012-08-18 22:01:30

Design and Automation of Security Management System for Industries Based On M2M Technology

Authors: Swathi Bhupatiraju, T J V Subrahmanyeswara Rao
Comments: 8 Pages.

Security management at industrial sites is a very important requirement, especially at night. Security personnel are always needed to monitor industrial sites, and somebody must always be present on site to protect the industry from theft. In this paper, an approach to protecting industry by automating the security system is presented. The idea is developed using an ARM processor, an industry-leading embedded processor, and a GUI is developed that is very useful for remote monitoring and information collection. To attain reliability, the security level is improved with the help of GSM-based wireless technology, consisting of a transmitter (GSM modem) at the site location and a GSM mobile phone as the receiver. Information transmitted by the GSM modem at the plant location is sent to the responsible person's mobile as a text message.
Category: Digital Signal Processing

[108] viXra:1208.0120 [pdf] submitted on 2012-08-18 22:02:27

A Study on Digital Forensics Standard Operation Procedure for Wireless Cybercrime

Authors: Yun-Sheng Yen, I-Long Lin, Annie Chang
Comments: 14 Pages.

With the increasing growth of Internet users and the development of new technologies, current legislation and security measures have had difficulty keeping up, and cybercrime numbers have therefore increased rapidly. Although wireless cybercrime is a new threat, with detailed tracking and investigation effort the professional will ultimately find some form of digital evidence, which often has to be identified and preserved in order to be recognized and to restore the truth. The establishment of training, from digital evidence handling through to forensics, therefore requires immediate implementation; raising standards and knowledge will strengthen the competence and credibility of the forensics unit's professional ability in assisting the fight against all crime.
Category: Digital Signal Processing

[107] viXra:1208.0119 [pdf] submitted on 2012-08-18 22:03:30

Formalization of the Wolof with Nooj: Implementation on the Wolof Dictionary

Authors: Haby Diallo, Alex Corenthin, Claude Lishou
Comments: 17 Pages.

This paper introduces the NooJ module for the Wolof language and the implemented electronic dictionary. The linguistic resources used are common-usage dictionaries, including Arame Fall's and Jean Léopold Diouf's dictionaries, as well as the Wolof lexicon available at CLAD (Centre de Linguistique Appliquée de Dakar). The present work first focuses on the socio-linguistic situation of the Wolof language and describes its alphabet before, in a second part, introducing the complex morphology of this language. The third part is devoted to explaining how the core of the dictionary has been constructed and to describing the inflectional and derivational rules used to implement it in NooJ. Finally, the first results achieved with NooJ are presented.
Category: Digital Signal Processing

[106] viXra:1208.0118 [pdf] submitted on 2012-08-18 22:04:29

Non Destructive Quality Evaluation of Nicotiana Tabacum Using Off-Line Machine Vision

Authors: Latesh N. Patel, Kavindra R. Jain, Hitesh B. Shah, Chintan K. Modi
Comments: 7 Pages.

A number of techniques have been proposed in the past for automatic quality evaluation of pre-processed tobacco using image processing. Although some studies have aimed to evaluate the quality of processed tobacco, there is no automatic system capable of evaluating it. This paper proposes a new method for counting normal chewing tobacco (Nicotiana tabacum) particles as well as foreign elements using machine vision. With the proposed method, quality evaluation of processed chewing tobacco, a ready-to-eat product, can be performed, which is very beneficial for quality control.
Category: Digital Signal Processing

[105] viXra:1208.0117 [pdf] submitted on 2012-08-18 22:05:25

MapClim System: Early Warning Mechanisms for Climate Change in Africa

Authors: Alassane Diop, Antoine Goundjo
Comments: 11 Pages.

This work is carried out within the framework of implementing innovative technological applications to facilitate the resolution of development issues raised by climate change. It describes the architecture used for the design of MapClim, a freely accessible system intended for the collection, systematic monitoring and dissemination of data, together with the creation of community and national response capacities. The system thus makes it possible to maintain a permanent state of vigilance and to take fast and efficient action in favour of adaptation to climate change. In the light of data, indicators and process charts, the article presents results on the effects of climate change in West Africa.
Category: Digital Signal Processing

[104] viXra:1208.0116 [pdf] submitted on 2012-08-19 00:03:54

Future Research Directions in Skyline Computation

Authors: R.D. Kulkarni, B.F. Momin
Comments: 11 Pages.

The purpose of this paper is to focus on current trends in the area of skyline computation, a problem studied by the database research community, and to highlight further research directions in the same area. In recent years, skyline queries, or skyline computations, have been used in many advanced applications such as location-based queries originating from mobile phones. Although the concept of skyline computation originated in centralized database environments, as application needs grew the concept has been applied successfully to modern computational environments such as distributed networks, real-time systems and mobile ad hoc networks. This paper surveys the various methodologies used to date for skyline computation and then highlights future research directions in this area; a minimal sketch of the skyline operator follows this entry.
Category: Digital Signal Processing
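For orientation, the skyline operator discussed in this survey can be illustrated with a simple block-nested-loops computation over two-dimensional points (e.g. price versus distance, both minimized); the data below are invented, and the sketch ignores the distributed and streaming settings the survey covers.

```python
def dominates(p, q):
    """p dominates q if p is no worse in every dimension and strictly better in at least one (minimization)."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def skyline(points):
    """Block-nested-loops skyline: keep every point not dominated by any other point."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Example: hotels as (price, distance_to_beach) -- both attributes to be minimized.
hotels = [(50, 8), (60, 2), (80, 1), (55, 6), (90, 9), (45, 9)]
print("skyline:", skyline(hotels))
```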

[103] viXra:1208.0115 [pdf] submitted on 2012-08-19 00:04:55

Information Retrieval in Intelligent Systems: Current Scenario & Issues

Authors: Sudhir Ahuja, Rinkaj Goyal
Comments: 8 Pages.

Web space is a huge repository of data, and every day a great deal of new information is added to it. The more information there is, the greater the demand for tools to access it. Answering users' queries about online information intelligently is one of the great challenges of information retrieval in intelligent systems. In this paper, we start with a brief introduction to information retrieval and intelligent systems and explain how Swoogle, the semantic search engine, uses its algorithms and techniques to search for the desired content on the web. We then continue with the clustering technique used to group similar items together and discuss the machine learning technique called self-organizing maps (SOM) [6], a data visualization technique that reduces the dimensionality of data through the use of self-organizing neural networks. We then discuss how SOM is used to visualize the contents of data in the form of maps (a minimal SOM sketch follows this entry). In this way, websites and machines can be used to retrieve exactly the information users want from them.
Category: Digital Signal Processing
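The abstract above mentions self-organizing maps for visualizing data. Below is a minimal SOM training loop on toy two-dimensional data, assuming a small one-dimensional map, a decaying learning rate and a Gaussian neighbourhood; it is a generic illustration of the technique, not the system described in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two 2-D clusters (placeholders for document feature vectors).
data = np.vstack([rng.normal(0.2, 0.05, (100, 2)), rng.normal(0.8, 0.05, (100, 2))])

# A small 1-D map of 10 units; weights start random.
n_units, n_iter = 10, 2000
weights = rng.random((n_units, 2))
positions = np.arange(n_units)

for t in range(n_iter):
    lr = 0.5 * (1 - t / n_iter)                                # decaying learning rate
    sigma = max(1e-3, 3.0 * (1 - t / n_iter))                  # decaying neighbourhood radius
    x = data[rng.integers(len(data))]
    bmu = np.argmin(np.linalg.norm(weights - x, axis=1))       # best-matching unit
    h = np.exp(-((positions - bmu) ** 2) / (2 * sigma ** 2))   # neighbourhood function
    weights += lr * h[:, None] * (x - weights)                 # pull BMU and neighbours toward x

print("trained unit weights (should spread across the two clusters):")
print(np.round(weights, 2))
```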

[102] viXra:1208.0114 [pdf] submitted on 2012-08-19 00:05:53

Encoding Time Reduction Method for the Wavelet Based Fractal Image Compression

Authors: Jyoti Bhola, Simarpreet Kaur
Comments: 8 Pages.

In this paper we present two implementations of fractal image compression (pure-fractal and wavelet-fractal algorithms), applied to images in order to investigate the compression ratio and the corresponding image quality measured by the peak signal-to-noise ratio (PSNR); a short PSNR sketch follows this entry. We also set a threshold value to reduce the redundancy of domain blocks and range blocks before searching and matching, which greatly reduces the computing time, and we seek the threshold value at which the optimum encoding time is achieved.
Category: Digital Signal Processing
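Since the comparison in this entry relies on the peak signal-to-noise ratio, a short PSNR helper is sketched below; the 8-bit peak value and the toy noisy image are assumptions used only to exercise the formula PSNR = 10 log10(peak^2 / MSE).

```python
import numpy as np

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB: 10 * log10(peak^2 / MSE)."""
    original = original.astype(np.float64)
    reconstructed = reconstructed.astype(np.float64)
    mse = np.mean((original - reconstructed) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)

# Toy check: an 8-bit image versus a noisy copy of itself.
rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
noisy = np.clip(img + rng.normal(0, 5, img.shape), 0, 255).astype(np.uint8)
print(f"PSNR = {psnr(img, noisy):.2f} dB")
```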

[101] viXra:1208.0113 [pdf] submitted on 2012-08-19 00:07:02

Depth-Wise Segmentation of 3D Images Using Entropy

Authors: S. S. Mirkamali, P. Nagabhushan
Comments: 9 Pages.

Recent advances in 3D modeling and depth estimation of objects have created many opportunities for multimedia computing. Using the depth information of a scene enables us to propose a brand new segmentation method called depth-wise segmentation. Unlike conventional image segmentation problems, which deal with surface-wise decomposition, depth-wise segmentation is the problem of slicing an image containing 3D objects in a depth-wise sequence. The proposed method uses the entropy of a depth image to characterize the edges of objects in a scene. The obtained edges are then used to find line segments, and by linking the line segments based on their object and layer numbers, object layers are obtained. To test the proposed segmentation algorithm, we use synthetic images of some 3D scenes and their depth maps. The experimental results show that our method gives good performance.
Category: Digital Signal Processing

[100] viXra:1208.0112 [pdf] submitted on 2012-08-19 00:08:13

Single Machine Scheduling Problem under Fuzzy Processing Time and Fuzzy Due Dates

Authors: Vikas S. Jadhav, V. H. Bajaj
Comments: 8 Pages.

In this paper, we consider a Single Machine Scheduling Problem (SMSP) in which n jobs are to be processed, involving fuzzy processing times and fuzzy due dates. Different due dates are considered for each job, meeting customer demand with a higher satisfaction level. The main objective is to minimize the total penalty cost of the schedule of the jobs on the single machine; this cost is composed of the total earliness and the total tardiness cost. An algorithm is developed using the Average High Ranking Method (AHRM) which minimizes the total penalty cost due to earliness (lateness) of jobs in a fuzzy environment. Finally, a numerical example is given to illustrate the proposed method.
Category: Digital Signal Processing

[99] viXra:1208.0111 [pdf] submitted on 2012-08-19 00:09:18

An Analysis of Packet Fragmentation Attacks vs. Snort Intrusion Detection System

Authors: Tian Fu, Te-Shun Chou
Comments: 12 Pages.

When Internet Protocol (IP) packets travel across networks, they must meet size requirements defined in the network’s Maximum Transmission Unit (MTU). If the packet is larger than the defined MTU, then it must be divided into smaller pieces, which are known as fragments. Attackers can exploit this process for their own purposes by attacking the systems. Packet fragmentation attacks have caused problems for Intrusion Detection Systems (IDSs) for years. In this paper, Snort IDS was tested. VMware virtual machines were used as both the host and victim. Other tools were also implemented in order to generate attacks against the IDS. The experiment results show the performance of Snort IDS when it was being attacked, and the ability of Snort to detect attacks in different ways.
Category: Digital Signal Processing

[98] viXra:1208.0110 [pdf] submitted on 2012-08-19 00:10:25

Fabric Inspection System using Artificial Neural Networks

Authors: P. Banumathi, G. M. Nasira
Comments: 8 Pages.

A fabric inspection system is important for maintaining the quality of fabric. Fabric inspection has been carried out manually by human visual inspection for a long time; the work of the inspectors is very tedious, time-consuming and costly. To reduce this waste of cost and time, automatic fabric inspection is required. This paper proposes an approach to recognizing fabric defects in the textile industry so as to minimize production cost and time. The fabric inspection system first acquires high-quality, vibration-free images of the fabric. The acquired images are then subjected to a defect segmentation algorithm. The output of the processed image is used as input to an Artificial Neural Network (ANN), which uses the back-propagation algorithm to calculate the weight factors and generates the desired classification of defects as output.
Category: Artificial Intelligence

[97] viXra:1208.0109 [pdf] submitted on 2012-08-19 00:11:50

Experience on re-Engineering Applying with Software Product Line

Authors: Waraporn Jirapanthong
Comments: 10 Pages.

In this paper, we present our experience from a reengineering project. The software project is to re-engineer the original system of a company to meet new requirements and changed business functions. Reengineering is a process that involves not only the software system but also the underlying business model. In particular, the new business model is designed together with new technologies to support the new system. This paper presents our experience of applying a software product line approach to develop the new system, supporting both the original business functions and the new ones.
Category: Artificial Intelligence

[96] viXra:1208.0108 [pdf] submitted on 2012-08-19 00:16:35

Tea Insect Pests Classification Based on Artificial Neural Networks

Authors: R. K. Samanta, Indrajit Ghosh
Comments: 13 Pages.

Tea is one of the major health drinks of our society and a perennial crop in India and other countries. One of the barriers to tea production is insect pests. This paper presents an automatic diagnosis system for detecting tea insect pests based on artificial neural networks, applying correlation-based feature selection (CFS) and an incremental back-propagation network (IBPLN). The system is applied to a new database created by the authors from the records of tea gardens in the North Bengal districts of India. We compare classification results with and without dimensionality reduction; the correct classification rate of the proposed system is 100% in both cases.
Category: Artificial Intelligence

[95] viXra:1208.0107 [pdf] submitted on 2012-08-19 00:19:13

Simulation Study For Performance Comparison in Hierarchical Network With CHG Approach in MANET

Authors: Anzar ahamd, R. Gowri, S.C.Gupta
Comments: 14 Pages.

The implementation of MANETs for commercial purposes is not an easy task. Unlike other wireless technologies such as cellular networks, MANETs face more difficult problems concerning management functions, routing and scalability. As a solution to these complications, clustering schemes have been proposed for MANETs in order to organize the network topology in a hierarchical manner, and many clustering techniques have been developed. Clustering is a method that aggregates nodes into groups; these groups are contained within the network and are known as clusters. Increasing network capacity and reducing the routing overhead through clustering brings more efficiency and effectiveness to scalability with respect to node numbers and the need for high mobility. In clustering, the manager node is responsible for many functions such as cluster maintenance, routing table updates and the discovery of new routes within the network, while another node, the gateway node, communicates with the other clusters. In this paper we remove the cluster head (CH) and propose a new approach in which the cluster head and the gateway are the same node, known as the cluster head gateway (CHG), which performs all the responsibilities of both the cluster head and the gateway. By applying this approach we reduce overheads and improve the overall performance of the network, while the throughput remains the same in both conditions, as shown with the help of EXata simulation.
Category: Digital Signal Processing

[94] viXra:1208.0106 [pdf] submitted on 2012-08-19 00:22:16

Object Detection in a Fixed Frame

Authors: Debjyoti Bagchi, Dipaneeta Roy Chowdhury, Satarupa Das
Comments: 9 Pages.

This paper presents an approach to the detection of objects in a fixed frame. In this approach, an acoustic sensor is used to detect any change in the surroundings of the fixed frame, i.e. the entry of a foreign object. When this happens, the sensor sends a signal to the system, where it is processed; if the sampling result crosses a fixed threshold, the camera is switched on, takes a snapshot and is then switched off again. With this snapshot, we compute the frame difference against an initially stored snapshot of the fixed frame and, based on the difference, determine the nature and type of the foreign object. A minimal frame-differencing sketch follows this entry.
Category: Digital Signal Processing
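The detection step in the abstract above is frame differencing against a stored reference. A minimal OpenCV sketch of that step is given below on synthetic frames; the threshold value and the synthetic object are assumptions, and the acoustic-sensor trigger is outside the scope of the sketch.

```python
import numpy as np
import cv2

# Reference snapshot of the empty scene and a new snapshot containing a foreign object
# (both synthetic here, standing in for camera captures).
reference = np.full((120, 160), 90, dtype=np.uint8)
current = reference.copy()
cv2.rectangle(current, (60, 40), (100, 80), 200, -1)   # the "foreign object"

# Frame difference, thresholding and contour extraction to locate the object.
diff = cv2.absdiff(current, reference)
_, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

for c in contours:
    x, y, w, h = cv2.boundingRect(c)
    print(f"object detected at x={x}, y={y}, size={w}x{h}, area={cv2.contourArea(c):.0f}")
```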

[93] viXra:1208.0105 [pdf] submitted on 2012-08-19 00:23:26

Semantic web based Sensor Planning Services (SPS) for Sensor Web Enablement (SWE)

Authors: P.Udayakumar, M.Indhumathi
Comments: 15 Pages.

The Sensor Planning Service (SPS) is a service model that defines the web service interface for requesting user-driven acquisitions and observations. It is defined by the Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) group to provide a standardized interface for tasking sensors, allowing tasks for sensors and sensor data to be defined, checked, modified and cancelled. The goal of the SPS in OGC SWE is to standardize the interoperability between a client and a server collection management environment. The SPS is needed to automate complex data flows in large enterprises that depend on live and stored data from sensors and multimedia equipment. The obstacles faced by the SPS are (i) making observations from sensors at the right time and the right place, and (ii) acquiring information collected at a specific time and a specific place. These two obstacles are addressed with semantic web technology, which provides and applies ontology-based semantic rules to the user-driven acquisitions and observations of the SPS. The novelty of our approach lies in adding semantic rules to the SPS model in SWE; we implement the SPS with a semantic knowledge base to achieve a highly standardized service model for the SPS of OGC SWE.
Category: Digital Signal Processing

[92] viXra:1208.0104 [pdf] submitted on 2012-08-19 00:24:39

High Reliable Secure Data Collection Using Complexity Exchanging Code Key Method Addressing Protocol in Wireless Sensor Network

Authors: V.jayaraj, M.Indhumathi
Comments: 12 Pages.

A Wireless Sensor Network (WSN) is an emerging field in information and communication technology. In a WSN, data transmission and data collection are insecure because of sensor node incompatibility, so providing security to the sensor network is very important. A key-based mechanism secures data collection and is mainly used to guarantee data confidentiality. Range pairwise keys are widely used because of the need for data encryption and decryption between each pair of nodes within communication range. A fixed-key mechanism makes it difficult for the attacker to detect the regularity of the randomly generated key-chain function in privacy homomorphism (PH). PH means that no intermediate node encrypts or decrypts; intermediate nodes only collect and aggregate data, with encryption and decryption performed at the end points. It is a special key-based scheme, based on the beta distribution and statistical tools, using a key-code method with the proposed complexity-exchanging code key method addressing protocol. We show how to significantly reduce attacks and secure data collection in a wireless sensor network.
Category: Digital Signal Processing

[91] viXra:1208.0103 [pdf] submitted on 2012-08-19 00:36:00

Performance Improvement of a Navigation System Using Partial Reconfiguration

Authors: S.S.Shriramwar, N.K.Choudhari
Comments: 8 Pages.

Dynamic Partial Reconfiguration (DPR) of FPGAs presents many opportunities for application design flexibility, enabling tasks to be swapped dynamically in and out of the FPGA without interrupting the entire system. In this work, we have implemented a line-follower robot for a white line as well as for a black line; both modules are programmed in VHDL. The robot is made to run on the white line and dynamically reconfigures the FPGA at run time for the black line, or vice versa. The design includes two modules: one is static and the other resides in a partially reconfigurable region (PRR), which is dynamic. The controllers are the static modules used to control the flow of data between the reconfigurable modules and the external world (host environment) through bus macros, whereas the white-line and black-line modules are designed as dynamic modules.
Category: Digital Signal Processing

[90] viXra:1208.0102 [pdf] submitted on 2012-08-19 00:37:09

Enhancing Data Security in Medical Information System Using the Watermarking Techniques and Oracle Secure File LOBs

Authors: Said Aminzou, Brahim ER-RAHA, Youness Idrissi Khamlichi, Mustapha Machkour, Karim Afdel
Comments: 10 Pages.

In this paper, we propose an efficient digital watermarking scheme to strengthen the security level already provided by the database management system and to prevent illegal access to the full content of the database, including patients' information. Doctors diagnose medical images by examining a Region of Interest (ROI). The ROI of a medical image is an area containing important information and must be stored without any distortion; if a medical image is illegally obtained or its content is changed, it may lead to a wrong diagnosis. We substitute the part of the LSB bit-plane of the image lying outside the ROI with the patient data and a binary feature map. The latter is obtained by extracting the edges, using a Laplacian operator, of the image resized to a quarter of its size. This image is integrated directly into the database. The edge map and invariant moments are used to check the integrity of the image. A minimal sketch of LSB embedding outside an ROI follows this entry.
Category: Digital Signal Processing
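The embedding idea in this entry, replacing least significant bits outside the region of interest, can be sketched as follows. The rectangular ROI, the payload and the bit ordering are assumptions for illustration; the paper's full scheme additionally embeds an edge map and checks integrity with invariant moments, which are not reproduced here.

```python
import numpy as np

def embed_outside_roi(image, payload_bits, roi):
    """Write payload bits into the LSBs of pixels outside the ROI rectangle (r0, r1, c0, c1)."""
    out = image.copy()
    r0, r1, c0, c1 = roi
    mask = np.ones(image.shape, dtype=bool)
    mask[r0:r1, c0:c1] = False                      # protect the diagnostically important region
    coords = np.argwhere(mask)
    if len(payload_bits) > len(coords):
        raise ValueError("payload too large for the area outside the ROI")
    for bit, (r, c) in zip(payload_bits, coords):
        out[r, c] = (out[r, c] & 0xFE) | bit        # replace the least significant bit
    return out

def extract_outside_roi(image, n_bits, roi):
    r0, r1, c0, c1 = roi
    mask = np.ones(image.shape, dtype=bool)
    mask[r0:r1, c0:c1] = False
    coords = np.argwhere(mask)
    return [int(image[r, c] & 1) for r, c in coords[:n_bits]]

rng = np.random.default_rng(2)
img = rng.integers(0, 256, (64, 64), dtype=np.uint8)
roi = (16, 48, 16, 48)                              # assumed ROI rectangle
message = "patient:0042"                            # hypothetical patient data payload
bits = [int(b) for byte in message.encode() for b in format(byte, "08b")]

stego = embed_outside_roi(img, bits, roi)
recovered = extract_outside_roi(stego, len(bits), roi)
chars = [chr(int("".join(map(str, recovered[i:i + 8])), 2)) for i in range(0, len(recovered), 8)]
print("recovered payload:", "".join(chars))
assert np.array_equal(stego[16:48, 16:48], img[16:48, 16:48])   # ROI is untouched
```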

[89] viXra:1208.0101 [pdf] submitted on 2012-08-19 01:48:24

Building a Prototype Prepaid Electricity Metering System Based on RFID

Authors: Fawzi Al-Naima, Bahaa Jalil
Comments: 17 Pages.

The prepaid meter is important in giving the consumer a sense of his or her energy consumption, in eliminating the difficulties the electrical utility employee faces in reading the conventional electromechanical meter, and in eliminating errors incurred in issuing bills. This paper is aimed at developing a prototype of a management system for a prepaid electrical power meter. The designed prepaid meter consists of an RFID reader, a microcontroller, a digital meter and a wireless gateway. The proposed prototype metering system consists of two parts: clients and a server. An RFID reader is used to read the ID of the credit card, and a PC is connected to a hardware-simulated circuit designed and implemented to simulate the operation of the digital meter. The server is located in the local substation; it receives the card's ID from the clients and sends the ID's information back to the client after checking and/or updating the database.
Category: General Science and Philosophy

[88] viXra:1208.0100 [pdf] submitted on 2012-08-19 01:49:41

Modified TCP Peach Protocol for Satellite based Networks

Authors: Mohanchur Sarkar, K.K.Shukla, K.S.Dasgupta
Comments: 20 Pages.

TCP has become the de-facto protocol standard for congestion control in the existing terrestrial Internet. But it was designed keeping in mind low Round Trip Time and low channel error rates typically experienced in a wired network. In this paper we have considered TCP Protocol variants like Tahoe, Reno, New Reno, SACK, FACK, Vegas, Westwood and Peach. TCP Peach is found to be better than the other TCP Protocol variants in case of satellite based networks but its performance also degrades when the packet error probability is high and in cases where there are multiple packet losses in the same window. In this paper a modification has been suggested to the existing PEACH protocol and the modified PEACH Protocol is tested to provide performance improvement especially in the cases where the packet error rate is very high. The modified Peach Protocol has been implemented in the ns2 Simulator and evaluated considering a Geo Satellite Network with varying channel conditions with all other TCP variants.
Category: Digital Signal Processing

[87] viXra:1208.0099 [pdf] submitted on 2012-08-19 01:50:28

TSRD-RL Algorithm Based Secured Route Discovery for Manet with Improved Route Lifetime

Authors: S. Priyadarsini
Comments: 16 Pages.

An ad hoc network is a collection of wireless mobile devices with limited broadcast range and resources and no fixed infrastructure. The critical issue for routing in a mobile ad hoc network is how to discover a secured path with the longest route lifetime and minimum node computation. Node mobility, node power constraints and security attacks by malicious nodes cause frequent path failures, which in turn cause frequent route discoveries that affect both the routing protocol performance and the node computation overhead. We therefore propose an efficient Trust-based Multipath Route Discovery with improved Route Lifetime algorithm to provide a trust-based solution for the security attacks that affect routing protocol performance. We implement the proposed algorithm in AODV and evaluate its performance. Our protocol improves network performance and reduces computation overhead by avoiding frequent route discovery, since we select secured, stable multiple paths with the longest lifetime. With the help of a network simulator, we show that the proposed protocol performs better than existing stability-based routing protocols, with an improved packet delivery ratio.
Category: Digital Signal Processing

[86] viXra:1208.0098 [pdf] submitted on 2012-08-19 01:51:18

A Novel Fuzzy Logic Solar (PV) - Grid/DG Powered Pump Controller for Efficient Water Management

Authors: Sandeep, Sushil Kumar Singh, P. Aditya Vardhan, Prasun Anand, S.N. Singh
Comments: 8 Pages.

Water management is an interdisciplinary field concerned with the management and optimum utilization of water. The consumption of water is increasing due to the rapid growth of population, and, owing to ecological changes, the water stored in and available from open-reservoir systems is shrinking every year. Thus the underground water source, pumped through submersible pumps, has become the major source of water at the domestic end. On the other hand, excessive and continuous use of water through these pumps may lead to depletion of the water table. In order to avoid this, an optimal control system for water management is proposed in the present work, achieved by controlling the operational time of a solar water pump in an optimal way. An efficient fuzzy model has been developed in a simulation environment to realize such an adaptive control system, and its implementation in a residential building has been studied.
Category: Digital Signal Processing

[85] viXra:1208.0097 [pdf] submitted on 2012-08-19 01:52:17

Design and Development of Simple Low Cost Rectangular Microstrip Antenna for Multiband Operation

Authors: Nagraj Kulkarni, S. N. Mulgi, S. K. Satnoor
Comments: 7 Pages.

This paper presents the design and development of a simple, low-cost rectangular microstrip antenna for multiband operation. By incorporating a U-slot of optimum geometry and open stubs at two opposite corners of the rectangular radiating patch, the antenna operates between 1.815 and 9.01 GHz in different frequency bands, with a virtual size reduction of 44.84%, and gives broadside radiation characteristics in each operating band. By placing the U-slot and open stubs at the corners, the copper area of the patch is reduced to 8.50% compared with the copper area of a conventional rectangular microstrip antenna designed for the same frequency. The experimental and simulated results are in good agreement with each other. The design concept of the antenna is given and experimental results are discussed. The proposed antenna may find applications in mobile, WLAN, WiMAX and SAR systems.
Category: Digital Signal Processing

[84] viXra:1208.0096 [pdf] submitted on 2012-08-19 01:53:22

Concealing Encrypted Messages using DCT in JPEG Images

Authors: Rita Chhikara, Sunil Kumar
Comments: 4 Pages.

Steganography is an important area of research in recent years, with a number of applications. It is the science of embedding information (the payload) into a cover medium, viz. text, video or image, without causing statistically significant modification to the cover. Modern secure image steganography presents the challenging task of transferring the embedded information to the destination without being detected. In this paper we present an image-based steganography method that combines the Discrete Cosine Transform (DCT) and compression techniques with the LSB technique on raw images to enhance the security of the payload. Initially the cover image is transformed from the spatial domain to the frequency domain using the DCT; the image is then quantized, and the LSB technique is used to insert data in pixels specified according to a range.
Category: Digital Signal Processing

[83] viXra:1208.0095 [pdf] submitted on 2012-08-19 01:54:10

A New Current-Mode Sigma Delta Modulator

Authors: Ebrahim Farshidi
Comments: 10 Pages.

In this paper, an alternative structure for a continuous-time sigma-delta modulator is presented. In this modulator, second-generation current conveyors are employed to implement the integrators in the loop filter. The modulator is designed in CMOS technology and features low power consumption (<2.8 mW), a low supply voltage (±1.65 V), a wide dynamic range (>65 dB) and a 180 kHz bandwidth. Simulation results confirm that this design is suitable for data converters.
Category: Digital Signal Processing

[82] viXra:1208.0094 [pdf] submitted on 2012-08-19 01:56:48

Diode Bridge T-type LC Reactor as Transformer Inrush Current Limiter

Authors: Mohd Wazir Mustafa, Leong Bee Keoh, Sazali P. Abdul Karim
Comments: 10 Pages.

An improved inrush current limiter (ICL) is presented that forms a highly effective inrush filter, further reducing the inrush current peaks and ripple voltage. The proposed ICL is constructed using a diode bridge and a T-type LC reactor consisting of two inductors and one capacitor, and is connected at each phase of the primary winding of a three-phase transformer. The line current is rectified by the diode bridge, and the increase in inrush current is limited by the T-type LC reactor. The proposed ICL reduces the peak inrush current by 80% more than a single-inductor type reactor. The potential for resonance in the T-type reactor is completely avoided, since the LC filter's resonant frequency is very low while the lowest frequency the LC filter will ever see is well above its resonant point. The proposed ICL is numerically tested using the simulation package PSCAD.
Category: Digital Signal Processing

[81] viXra:1208.0093 [pdf] submitted on 2012-08-19 01:58:06

Cascaded Multilevel Inverter for Photovoltaic Systems With PI Control

Authors: Shantanu Chatterjee, S Mallika
Comments: 15 Pages.

The use of renewable energy such as solar, wind, water and nuclear energy is in huge demand in the present era. Converters used in photovoltaic systems have two stages: a boost chopper and a PWM inverter. However, the integration of these two stages has certain problems, such as decreased efficiency, problematic interaction between the stages and difficulties with maximum power point tracking (MPPT), so the total energy produced is not fully utilized and only part of it can be used fruitfully. In this paper we therefore propose a novel use of a cascaded H-bridge multilevel inverter topology integrated with a PV module and controlled by a PI controller. The usefulness of this topology lies in the fully controlled PI controller and the resulting increase in system performance. All the gate pulses to the IGBT switches are provided by an FPGA (Field Programmable Gate Array) kit. A detailed discussion of the whole system, along with the block diagram and basic design, is given in this paper, together with simulated results and comparisons. The system yields higher efficiency, especially in low- or medium-power applications. A detailed harmonic discussion is also provided.
Category: Digital Signal Processing

[80] viXra:1208.0092 [pdf] submitted on 2012-08-19 01:59:02

Performance Analysis of Extended Channel Models for MIMO LTE SC-FDMA Uplink Systems

Authors: P.Balasundaram, Nilkantha Chakraborty
Comments: 11 Pages.

Long Term Evolution (LTE) is the latest standard proposed by the 3GPP on the way towards 4G. This paper presents an accurate performance analysis of the complete LTE uplink. Multiple-input multiple-output (MIMO) techniques have gathered much attention in recent years as a way to improve drastically the performance of wireless mobile communications. Simulation results compare different extended channel models, showing significant differences among them: the Extended Pedestrian A, Extended Vehicular A, Extended Typical Urban and High Speed Train Condition (HSTC) models are considered, and the measurement data are compared for the localized and distributed SC-FDMA mapping methods. From the results, we find that localized SC-FDMA outperforms distributed SC-FDMA in terms of SNR and BER.
Category: Digital Signal Processing

[79] viXra:1208.0091 [pdf] submitted on 2012-08-19 02:01:58

BER Performance of OFDM System with 16-QAM and Varying Length of Guard Interval

Authors: Amandeep Singh Sappal, Parneet Kaur
Comments: 8 Pages.

The orthogonal frequency division multiplexing (OFDM) scheme, being spectrally efficient, is used in modern communication systems. To achieve error-free communication, a guard interval is inserted using a cyclic prefix or zero padding. The bit error rate (BER) performance of an OFDM system with 16-QAM and varying guard interval (GI) length is presented; a minimal cyclic-prefix sketch follows this entry.
Category: Digital Signal Processing
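For reference, guard-interval insertion with a cyclic prefix can be sketched in a few NumPy lines: 16-QAM symbols are OFDM-modulated per symbol with an IFFT, the last samples are prepended as the cyclic prefix, and the receiver discards them before the FFT. The subcarrier count, prefix length and mapper below are assumptions, and no channel or BER measurement is modelled.

```python
import numpy as np

rng = np.random.default_rng(3)
N_SC, CP_LEN, N_SYM = 64, 16, 10      # assumed subcarrier count, guard-interval length, symbol count

# Simple 16-QAM mapping of random symbols (illustrative, not a standard-compliant mapper).
levels = np.array([-3, -1, 1, 3])
symbols = (levels[rng.integers(0, 4, (N_SYM, N_SC))]
           + 1j * levels[rng.integers(0, 4, (N_SYM, N_SC))]) / np.sqrt(10)

# OFDM modulation: IFFT per symbol, then prepend the last CP_LEN samples as a cyclic prefix.
time_domain = np.fft.ifft(symbols, axis=1)
with_cp = np.hstack([time_domain[:, -CP_LEN:], time_domain])

# Ideal channel; the receiver removes the guard interval and applies the FFT.
received = with_cp[:, CP_LEN:]
demod = np.fft.fft(received, axis=1)

print("max reconstruction error:", float(np.max(np.abs(demod - symbols))))
```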

[78] viXra:1208.0089 [pdf] submitted on 2012-08-19 02:03:38

Comparison of Sensitivity and Nonlinear Optimization Methods for Transmission Network LTCs Setting

Authors: F. Karbalaei, M. Ranjbar, M. Kavyani
Comments: 6 Pages.

This paper compares the sensitivity method with a proposed nonlinear optimization method for the setting of transmission network load tap changers (LTCs) as a preventive action against voltage instability. The aim of preventive actions is to increase the voltage stability margin. In contrast to emergency actions, preventive ones are implemented while the power system is stable; thus, in calculating a preventive action, in addition to increasing the stability margin, its effects on the current operating point of the system must be considered. The sensitivity method is a linearization-based method that uses the sensitivity of the loadability margin with respect to the tap values. In the proposed optimization method, the tap values are calculated using an optimal power flow model. Two groups of variables are used in the optimization problem: one group is related to the base case (the current operating point) and the other to the voltage stability boundary. In this way, the preventive actions do not cause undesirable changes in the system's current variables.
Category: Digital Signal Processing

[77] viXra:1208.0088 [pdf] submitted on 2012-08-19 02:05:12

Real Time Hand Gesture Recognition Using SIFT

Authors: Pallavi Gurjal, Kiran Kunnur
Comments: 15 Pages.

The objective of gesture recognition is to identify and distinguish human gestures and to use the identified gestures for applications in a specific domain. In this paper we propose a new approach to building a real-time system that identifies the standard gestures of American Sign Language (ASL), the dominant sign language of Deaf Americans, including deaf communities in the United States, in the English-speaking parts of Canada and in some regions of Mexico. We propose an improved scale-invariant feature transform (SIFT) and use it to extract the features. The objective of the paper is to decode a gesture video into the appropriate alphabet letters; a minimal SIFT matching sketch follows this entry.
Category: Digital Signal Processing
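The feature-extraction and matching stage described above can be sketched with OpenCV's SIFT implementation (available as cv2.SIFT_create in OpenCV 4.4 and later) plus Lowe's ratio test; the frames below are synthetic stand-ins for real gesture video, and the authors' improved SIFT variant is not reproduced.

```python
import numpy as np
import cv2

# Two synthetic "gesture frames": the second is a shifted copy of the first
# (placeholders for real video frames of a signed letter).
rng = np.random.default_rng(4)
frame1 = (rng.random((240, 320)) * 255).astype(np.uint8)
frame1 = cv2.GaussianBlur(frame1, (5, 5), 2)
frame2 = np.roll(frame1, (10, 15), axis=(0, 1))

sift = cv2.SIFT_create()                      # requires OpenCV >= 4.4
kp1, des1 = sift.detectAndCompute(frame1, None)
kp2, des2 = sift.detectAndCompute(frame2, None)

# Brute-force matching with Lowe's ratio test to keep only distinctive correspondences.
matcher = cv2.BFMatcher(cv2.NORM_L2)
good = [m for m, n in matcher.knnMatch(des1, des2, k=2) if m.distance < 0.75 * n.distance]
print(f"keypoints: {len(kp1)} / {len(kp2)}, good matches: {len(good)}")
```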

[76] viXra:1208.0087 [pdf] submitted on 2012-08-19 02:06:06

Numerical Modeling of Series Resistance of Millimeter-Wave DDR IMPATTs

Authors: Aritra Acharyya, J. P. Banerjee
Comments: 10 Pages.

This paper describes a computer-based method for calculating the parasitic positive series resistance of millimeter-wave packaged DDR IMPATT devices from high-frequency small-signal conductance-susceptance characteristics. The series resistance of the device can be obtained at the threshold condition, when the small-signal conductance of the packaged device just becomes negative and the susceptance becomes positive. Series resistance values are determined for two DDR silicon IMPATT diodes designed to operate in the W-band near the 94 GHz window frequency, using the method developed by the authors.
Category: Digital Signal Processing

[75] viXra:1208.0086 [pdf] submitted on 2012-08-19 02:08:46

Capacity Analysis of Adaptive Transmission Techniques over TWDP Fading Channel

Authors: Bhargabjyoti Saikia, Rupaban Subadar
Comments: 11 Pages.

Expressions for the single-user capacity have been presented for different power and rate adaptive transmission techniques over Two Wave Diffuse Power [TWDP] fading channels. Different power and rate adaptation techniques available in the literature have been considered in these analyses. A study of the effect of fading parameters on the channel capacity of different techniques has been presented. Also the results have been verified against the known special case results.
Category: Digital Signal Processing

[74] viXra:1208.0085 [pdf] submitted on 2012-08-19 02:14:17

Design of DE Optimized SSSC-Based Facts Controller

Authors: S.C.Swain, Srikanta Mahapatra, Sidhartha Panda, Susmita Panda
Comments: 16 Pages.

Power-system stability improvement by a static synchronous series compensator (SSSC)-based controller is studied in this paper. Conventionally, the lead-lag structure was used to modulate the injected voltage. But, in this paper PI, PID, PIDD structures are proposed to modulate the injected voltage. The design problem of the proposed controller is formulated as an optimization problem and Differential Evolution Algorithm is used to find out the optimal controller parameters. Different disturbances are applied to the single-machine infinite bus system and multi-machine infinite bus system and the performances of the conventional structure and the proposed structure are evaluated and compared. Only remote signals with required time delays are taken into account in this paper. The simulation results are presented and compared with a modern heuristic optimization technique under various loading conditions and disturbances to show the effectiveness of the proposed approach.
Category: Digital Signal Processing

[73] viXra:1208.0084 [pdf] submitted on 2012-08-19 02:15:35

A New Algorithm for TCSC- Based Controller Design by Using Differential Evolution Method

Authors: A.K.Baliarsingh, N.R.Samal, D.P.Dash, S.Panda
Comments: 8 Pages.

Design of an optimal controller requires the optimization of performance measures that are often non-commensurable and in competition with each other. Being a population-based approach, Differential Evolution (DE) is well suited to solving the design problem of a TCSC-based controller. This paper investigates the application of a DE-based multi-objective optimization technique to the design of a Thyristor Controlled Series Compensator (TCSC)-based supplementary damping controller. The design objective is to improve power system stability with minimum control effort. The proposed technique is applied to generate a Pareto set of globally optimal solutions to the given multi-objective optimization problem. Further, a fuzzy-based membership value assignment method is employed to choose the best compromise solution from the obtained Pareto solution set. Simulation results are presented to show the effectiveness and robustness of the proposed approach; a minimal DE tuning sketch follows this entry.
Category: Digital Signal Processing
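The DE tuning loop can be prototyped with SciPy's differential_evolution driver, as sketched below. The two-gain controller, the mapping from gains to damping and the ITAE-style cost are toy assumptions standing in for the paper's power-system objective; only the optimization workflow is illustrated.

```python
import numpy as np
from scipy.optimize import differential_evolution

def controller_cost(params):
    """Toy surrogate for a damping-controller objective: penalize slow, oscillatory settling of a
    second-order step response whose damping and natural frequency depend on the gains (assumed mapping)."""
    kp, ki = params
    zeta = np.clip(0.1 + 0.05 * kp + 0.02 * ki, 0.05, 2.0)
    wn = 1.0 + 0.1 * kp
    t = np.linspace(0, 20, 500)
    wd = wn * np.sqrt(abs(1 - zeta ** 2)) + 1e-9
    y = 1 - np.exp(-zeta * wn * t) * np.cos(wd * t)          # crude step-response surrogate
    itae = np.trapz(t * np.abs(1 - y), t)                    # integral of time-weighted absolute error
    return itae

bounds = [(0.0, 20.0), (0.0, 10.0)]                          # assumed gain ranges
result = differential_evolution(controller_cost, bounds, maxiter=50, seed=1, tol=1e-6)
print("best gains:", np.round(result.x, 3), "cost:", round(result.fun, 4))
```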

[72] viXra:1208.0083 [pdf] submitted on 2012-08-19 02:16:22

Hybrid Algorithm for Detection of High Impedance Arcing Fault in Overhead Transmission System

Authors: Abdulhamid.A. Abohagar, Mohd.Wazir.Mustafa, Nasir A. Al-geelani
Comments: 18 Pages.

High impedance fault (HIF) detection is a long-standing problem involving very complex phenomena, because of the fault's distinctive asymmetric and nonlinear behaviour. Moreover, arcing, which is most often associated with high impedance faults, is the most serious problem: the arc is a source of risk to human life and of fire hazard, and additionally results in property damage. From this point of view, the detection and discrimination of high impedance faults remain a challenge for protection engineers. In this paper a new high impedance fault model is introduced, and a combination of wavelet transform and neural network is presented to detect high impedance faults. The discrete wavelet transform (DWT) is used for feature extraction, extracting useful information from the distorted current signal generated by the transmission network under the effect of a high impedance fault (a minimal DWT feature sketch follows this entry). In order to improve training convergence and to reduce the number of inputs to the neural network, wavelet coefficients are computed and used as inputs for training a back-propagation neural network. A multi-layer back-propagation neural network (BP-NN) is used as the classifier for high impedance faults, discriminating them from events such as capacitor switching and load switching.
Category: Digital Signal Processing
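The wavelet feature-extraction stage can be sketched with PyWavelets: decompose the current waveform with a multilevel DWT and use the energy of each detail band as a compact feature vector for the neural-network classifier. The synthetic waveform, the 'db4' wavelet and the decomposition level below are assumptions, not the authors' settings.

```python
import numpy as np
import pywt

fs, f0 = 10_000, 50                       # assumed sampling rate (Hz) and power frequency
t = np.arange(0, 0.2, 1 / fs)

# Synthetic load current plus an asymmetric, randomly flickering component imitating HIF arcing.
rng = np.random.default_rng(5)
current = np.sin(2 * np.pi * f0 * t)
arc = 0.15 * np.sign(np.sin(2 * np.pi * f0 * t)) * (rng.random(t.size) > 0.6)
faulted = current + arc

def wavelet_features(signal, wavelet="db4", level=4):
    """Energy of each detail band from a multilevel DWT -- a compact feature vector for the classifier."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return [float(np.sum(c ** 2)) for c in coeffs[1:]]       # skip the approximation band

print("healthy  :", np.round(wavelet_features(current), 4))
print("HIF-like :", np.round(wavelet_features(faulted), 4))
```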

[71] viXra:1208.0082 [pdf] submitted on 2012-08-18 08:58:40

A Cryptosystem for XML Documents

Authors: A. A. Abd EL-Aziz, A.kannan
Comments: 3 Pages.

In this paper, we propose a cryptosystem (encryption/decryption) for XML data using the Vigenere cipher algorithm and the ElGamal cryptosystem. Such a system is designed to achieve security properties such as confidentiality, authentication, integrity and non-repudiation. We use XML data for our experimental work. Since the Vigenere cipher is not monoalphabetic, the number of possible keywords of length m is 26^m, so even for relatively small values of m an exhaustive key search would require a long time. A minimal Vigenere sketch follows this entry.
Category: Digital Signal Processing
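The Vigenere layer of the proposed cryptosystem is easy to illustrate; the sketch below encrypts and decrypts a sample XML fragment with a placeholder keyword. The ElGamal stage, which would protect the keyword itself, is omitted, and the keyword and fragment are assumptions.

```python
def vigenere(text, key, decrypt=False):
    """Classic Vigenere cipher over A-Z; non-letter characters pass through unchanged."""
    out, k = [], 0
    for ch in text.upper():
        if ch.isalpha():
            shift = ord(key[k % len(key)].upper()) - ord("A")
            if decrypt:
                shift = -shift
            out.append(chr((ord(ch) - ord("A") + shift) % 26 + ord("A")))
            k += 1
        else:
            out.append(ch)
    return "".join(out)

plaintext = "<patient><name>JOHN DOE</name></patient>"   # sample XML fragment (hypothetical)
keyword = "SECRETKEY"                                     # placeholder keyword
cipher = vigenere(plaintext, keyword)
print(cipher)
print(vigenere(cipher, keyword, decrypt=True))
```

Because each letter position cycles through the keyword, a keyword of length m gives 26^m possibilities, matching the key-space claim in the abstract.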

[70] viXra:1208.0081 [pdf] submitted on 2012-08-18 09:00:01

Interband Alias-Free Subband Adaptive Filtering with Critical Sampling

Authors: K.Sreedhar
Comments: 10 Pages.

Adaptive filtering is an important concept in the field of signal processing and has numerous applications in fields such as speech processing and communications. An adaptive filter is a filter that self-adjusts its transfer function according to an optimizing algorithm. Because of the complexity of the optimizing algorithms, most adaptive filters are digital filters that perform digital signal processing and adapt their performance based on the input signal. An adaptive filter is often employed in an environment of unknown statistics for various purposes such as system identification, inverse modeling for channel equalization, adaptive prediction and interference cancelling. Knowing nothing about the environment, the filter is initially set to an arbitrary condition and updated step by step toward an optimum filter setting. For updating, the least-mean-square (LMS) algorithm is often used for its simplicity and robust performance; a minimal fullband LMS sketch follows this entry. However, the LMS algorithm exhibits slow convergence when used with an ill-conditioned input such as speech and requires a high computational cost, especially when the system to be identified has a long impulse response. To overcome the limitations of conventional fullband adaptive filtering, various subband adaptive filtering (SAF) structures have been proposed. Properly designed, an SAF will converge faster at a lower computational cost than a fullband structure. However, its design should consider two facts: the interband aliasing introduced by the downsampling process degrades its performance, and the filter bank in the SAF introduces additional computational overhead and system delay. In this work, a critically sampled SAF structure that is almost alias-free is proposed to reap all the benefits of using an SAF. Since the proposed SAF operates on subbands that are almost alias-free, there is little interband aliasing error at the output. In each subband, the interband aliasing is obtained using a bandwidth-increased linear-phase FIR analysis filter, whose passband has an almost-unit magnitude response in the subband interval, and is then subtracted from the subband signal. This aliasing cancellation procedure, however, introduces spectral dips in the subband signals, which can be reduced by using a simple FIR filter. Simulations show that the proposed structure converges faster than an equivalent fullband structure at lower computational complexity, and faster than recently proposed SAF structures for a colored input. The analysis is carried out using MATLAB, a language for technical computing widely used in research, engineering and scientific computation.
Category: Digital Signal Processing

[69] viXra:1208.0080 [pdf] submitted on 2012-08-18 09:02:02

Storing XML Documents and XML Policies in Relational Databases

Authors: A. A. Abd EL-Aziz, A.kannan
Comments: 4 Pages.

In this paper, we explore how to support security models for XML documents by using relational databases. Our model is based on the model in [6], but we use our own algorithm to store the XML documents in relational databases.
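As a generic illustration of the storage step only (not the algorithm of [6] or the authors' own algorithm), the sketch below shreds an XML document into a simple node table in SQLite; the table layout and the sample document are assumptions made for the example.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Generic "edge table" shredding of an XML document into one relational table (illustrative only).
doc = "<library><book id='1'><title>XML Security</title></book></library>"   # made-up sample

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE xml_node (id INTEGER PRIMARY KEY, parent INTEGER, tag TEXT, text TEXT)")

def store(elem, parent=None):
    cur = con.execute("INSERT INTO xml_node (parent, tag, text) VALUES (?, ?, ?)",
                      (parent, elem.tag, (elem.text or "").strip()))
    node_id = cur.lastrowid
    for child in elem:                    # recurse so the parent/child structure is preserved
        store(child, node_id)

store(ET.fromstring(doc))
for row in con.execute("SELECT id, parent, tag, text FROM xml_node"):
    print(row)
```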
Category: Digital Signal Processing

[68] viXra:1208.0079 [pdf] replaced on 2012-08-18 08:00:15

Bounds Upon Graviton Mass, and Making Use of the Difference Between Graviton Propagation Speed and HFGW Transit Speed to Observe Post Newtonian Corrections to Gravitational Potential Fields

Authors: A.W. Beckwith
Comments: 6 Pages. Replacement of document due to fact author pasted in some equations which showed up badly in PDF form.

The author presents a post-Newtonian approximation based upon an earlier argument/paper by Clifford Will as to Yukawa revisions of gravitational potentials, in part initiated by gravitons with explicit mass dependence in their Compton wavelength. Prior work with Clifford Will's idea was stymied by its application to binary stars and other astrophysical objects with non-useful frequencies topping out at about 100 Hertz, thereby rendering Yukawa modifications of gravity due to gravitons effectively an experimental curiosity that was not testable with any known physics equipment.
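For context, the standard Yukawa-type modification of the Newtonian potential for a massive graviton (the generic textbook form; the notation is not taken verbatim from the paper):

```latex
V(r) \;=\; -\,\frac{GM}{r}\, e^{-r/\lambda_g},
\qquad
\lambda_g \;=\; \frac{h}{m_g c}\ \ \text{(graviton Compton wavelength)},
```

so that in the limit m_g → 0 (λ_g → ∞) the ordinary 1/r potential is recovered.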
Category: Quantum Gravity and String Theory

[67] viXra:1208.0078 [pdf] submitted on 2012-08-17 20:11:29

Relic High Frequency Gravitational waves from the Big Bang and How to Detect Them

Authors: A.W.Beckwith
Comments: 16 Pages.

Simple concepts paper, as related by the author. The main goal, aside from the detection of GW from the big bang if possible, would also be to determine whether the reheating phase of the cosmological history of the universe is accompanied by a steady state or by a dramatic increase in entropy production, reviewing Roos (2003) on reheating.
Category: Quantum Gravity and String Theory

[66] viXra:1208.0077 [pdf] submitted on 2012-08-18 02:07:53

Dark Energy in M-Theory

Authors: Jonathan Tooker
Comments: 8 Pages.

Dark Energy is yet to be predicted by any model that stands out in its simplicity as an obvious choice for unified investigative effort. It is widely accepted that a new paradigm is needed to unify the standard cosmological model (SCM) and the minimal standard model (MSM). The purpose of this article is to construct a modified cosmological model (MCM) that predicts dark energy and contains this unity. Following the program of Penrose, geometry rather than differential equations will be the mathematical tool. Analytical methods from loop quantum cosmology (LQC) are examined in the context of the Poincaré conjecture. The longstanding problem of an external time with which to evolve quantum gravity is resolved. The supernovae and WMAP data are reexamined in this framework. No exotic particles or changes to General Relativity are introduced. The MCM predicts dark energy even in its Newtonian limit while preserving all observational results. In its General Relativistic limit, the MCM describes dark energy as an inverse radial spaghettification process. Observable predictions for the MCM are offered. AdS/CFT correspondence is discussed. The MCM is the 10 dimensional union of de Sitter and anti-de Sitter space and has M-theoretical application to the five string theories which lack a unifying conceptual component. This component unifies gravitation and electromagnetism.
Category: Relativity and Cosmology

[65] viXra:1208.0076 [pdf] replaced on 2012-08-20 22:27:56

Derivation of the Fine Structure Constant

Authors: Jonathan Tooker
Comments: 3 Pages.

An alternate interpretation of Quantum Theory is given. The fine structure constant is derived. An experiment is proposed.
Category: Quantum Gravity and String Theory

[64] viXra:1208.0075 [pdf] submitted on 2012-08-18 03:35:00

Localization of the Energy Density of Gravitational Field in the Model of Nondecelerative Universe

Authors: Miroslav Sukenik, Jozef Sima
Comments: 4 Pages. No comment

There is a paradigm stating that the gravitational field is of a non-localizable and stationary nature. On the contrary, in our model of the nondecelerative Universe it is hypothesized that the gravitational field is always localizable and nonstationary. This assumption makes it possible to localize its energy density.
Category: Relativity and Cosmology

[63] viXra:1208.0074 [pdf] submitted on 2012-08-17 09:17:38

Will Theory of Everything Save Earth?

Authors: Janko Kokosar
Comments: 11 Pages. Science fiction story. The names in the story are taken from science, but they are mostly unconnected with the thoughts and descriptions of the corresponding characters in the story.

The story is a mix of futurology, science fiction, new technical ideas, science, scientific ideas of the author, and philosophy. The style is similar to that of Makela's paper. Using the example of a crisis of an overpopulated human species in the future, it describes how to develop a theory of everything, how to give more sense to amateur science, and how the reactions of professional science to amateur science are too inexact. Some ideas in the story are predictions of the author or are supported by him, and some are only for the course of the story or for setting up special situations for thought experiments. The names in the story are taken from science, but they are mostly unconnected with the thoughts and descriptions of the corresponding characters. An exception is James Randi.
Category: Mind Science

[62] viXra:1208.0073 [pdf] submitted on 2012-08-17 10:33:28

The Evolution of Species and Societies Through Three-Level Selection

Authors: Dao Einsnewt
Comments: 134 Pages.

Natural selection is three-level selection in the chronological order of individual, relational (kin), and group selection. The principal bases for cooperation in individual, relational, and group selection are unconditional reciprocity with no pre-condition for individuals, beneficial relatedness derived from a caring relation as the turning point deviating from reciprocity, and existential division of labor derived from handicapped individuals as the turning point deviating from relatedness, respectively. In group selection, all individuals are handicapped, and the existence of all individuals depends on an existential division of labor that overcomes individual handicaps. Group fitness becomes much more important than individual fitness, including the fitness by reciprocity and relatedness. Only a few insects (bees, wasps, termites and ants) and humans are in group selection, but they dominate the earth. Individual, relational, and group selections correspond to individualistic, collective, and harmonious social interactions and societies, respectively. Three-level selection is divided into three parts: (1) the three-branch way, (2) the development of the three-branch way, and (3) the Postmodern Period. (1) Three-level selection is based on the three-branch way consisting of the three basic human social lives (interactions): yin, yang, and harmony for feminine collective relation, masculine individualistic achievement, and harmonious cooperation, respectively, derived from neuroscience and psychology. The origin of the human social lives is explained by human evolution. The emergence of the harmonious social life and society occurred during human evolution, including ape evolution and hominid evolution. (2) In the Prehistoric Period, the harmonious social life evolved to adapt to the small social group in the prehistoric hunter-gatherer society. In the Early Period, starting from the Neolithic Revolution, the inevitably large civilized social group of the agricultural-nomad society destroyed the prehistoric harmonious small social group. As a result, the collective society and the individualistic society were formed separately. In the collective society, the state collective religion (Judaism, Islam, Hinduism, and Confucianism) dominated. In the individualistic society, state individualism (Greek mythology and science) dominated. Later, the harmonious religions (Christianity, Buddhism, and Daoism) emerged. In the Modern Period, modern mass printing and increased literacy led to communication and understanding among the three branches of human society, forming the modern three-branch society. (3) In the Postmodern Period, the postmodern economy is divided into individualistic (capitalistic), collective (socialistic), and adaptive (unified) economies. The postmodern unified political system is divided into the partisan unified political system, where the political parties represent the collective and the individualistic societies separately, and the nonpartisan unified political system, where the state represents both societies. The balanced unified education system should follow human development, from primarily collective education for childhood to primarily individualistic education for adulthood. In the postmodern religious system, the postmodern harmonious religion, complementary to the collective and individualistic societies, is the most suitable postmodern religion. Permanent world peace can be achieved by the balanced unified economic, political, educational, and religious systems.
Category: Mind Science

[61] viXra:1208.0072 [pdf] submitted on 2012-08-17 10:39:03

Insuring Existence of Vacuum Energy to Keep Computational “BITS” Present at Start of Cosmological Evolution, Even if Initial Spatial Radius Goes to Zero, not Planck Length

Authors: A.W. Beckwith
Comments: 21 Pages.

This construction preserves a minimum non-zero vacuum energy and, in doing so, keeps the bits needed for computational-bit cosmological evolution even if the initial radius goes to zero. We state that the presence of computational bits is necessary for cosmological evolution to commence. We also state that entanglement entropy not disappearing, even if the 4-dimensional length disappears, is similar to the problem of a higher-dimensional surface area at the tangential sheet intersecting a point of space-time, so that the presence of non-zero entanglement entropy embedded in an infinitesimally thick slice of space-time is the same as embedding the known 4-dimensional universe into a higher-dimensional structure, as hypothesized by Steinhardt and Turok and by Durrer.
Category: Quantum Gravity and String Theory

[60] viXra:1208.0071 [pdf] submitted on 2012-08-16 15:43:12

The Mass of the Electron-Neutrino Expressed by Known Physical Constants

Authors: Laszlo I. Orban
Comments: 5 Pages.

Many attempts have been made to understand neutrinos ever since Pauli theoretically deduced their existence from energy-conservation considerations. The present paper demonstrates that, starting from two appropriately chosen measurement systems, the mass of the electron-neutrino can be calculated from the mass of the electron and the fine-structure constant. The mass of the neutrino can be determined by the theoretically derived expression m_k = α³ m_e (m_k is the mass of the neutrino, m_e is the mass of the electron, α is the fine-structure constant).
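The arithmetic implied by the quoted expression, using CODATA-style values for α and m_e (a numerical check only, not an endorsement of the identification):

```python
# Numerical value of m_k = alpha^3 * m_e.
alpha = 1 / 137.035999       # fine-structure constant
m_e_eV = 510998.95           # electron mass in eV/c^2
m_k_eV = alpha ** 3 * m_e_eV
print(f"m_k = {m_k_eV:.3f} eV/c^2")   # about 0.199 eV/c^2
```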
Category: Mathematical Physics

[59] viXra:1208.0070 [pdf] submitted on 2012-08-16 17:24:24

From a Problem of Geometrical Construction to the Carnot Circles

Authors: Ion Patrascu, Florentin Smarandache
Comments: 4 Pages.

In this article we give a solution to a problem of geometrical construction and show the connection between this problem and the theorem on Carnot's circles.
Category: Geometry

[58] viXra:1208.0069 [pdf] submitted on 2012-08-16 17:33:09

Neutrosofia, O Nouă Ramură a Filosofiei / Neutrosophy, a New Branch of Philosophy [in Romanian]

Authors: Florentin Smarandache
Comments: 10 Pages.

This paper presents a new branch of philosophy, called neutrosophy, which studies the origin, nature, and scope of neutralities, as well as their interactions with different ideational spectra. Fundamental thesis: every idea is T% true, I% indeterminate, and F% false, where T, I, F are standard or non-standard subsets included in the non-standard interval ]-0, 1+[. Fundamental theory: every idea tends to be neutralized, diminished, and balanced by <Non-A> ideas (not only by <Anti-A>, as Hegel claimed), as a state of equilibrium. Neutrosophy is the basis of neutrosophic logic, a multiple-valued logic that generalizes fuzzy logic; of the neutrosophic set, which generalizes the fuzzy set; and of neutrosophic probability and neutrosophic statistics, which generalize classical and imprecise probability and statistics, respectively.
Category: General Science and Philosophy

[57] viXra:1208.0068 [pdf] submitted on 2012-08-16 18:50:12

The Periodic Table and the MCAS Electron Orbital Model

Authors: Joel M Williams
Comments: 4 Pages. This document was accepted for inclusion in the 3rd International Conference on the Periodic table held August 14-16, 2012 in Cusco, Peru.

A useful periodic table of the elements provides a number of important facts for its user: valence, metallic vs non-metallic compound formation and atomic weight being some of the more valuable. The current popular version has proven quite adequate in this regard for many chemists. A stubborn adherence to the stoic spdf-quantum-theory model, however, creates problems. Thus, this perspective requires that H be placed over Li and/or F even though it has little or no chemical similarity to either of them. This view of the electron structure of atoms also requires a number of “rules” and “hybridizations” to explain even the simplest aspects surrounding the electron structure and chemical bonding; mathematical orthogonality between periodic levels is a sham; and increasingly weirder shaped orbitals are needed to make the math fit the stoic spdf model. Transition and rare-earth/lanthanide series appear without physical rationale. In light of all these spdf-associated problems, the arrangement and behavior of the elements in the periodic table is addressed in this document according to the dynamic MCAS electron orbital model.
Category: Quantum Physics

[56] viXra:1208.0067 [pdf] replaced on 2012-09-06 05:01:33

Secrets of the Two of Concepts of Relativity Theory

Authors: V. V. Demjanov
Comments: 14 Pages. English and Russian

The question is raised once again of the existence of two concepts of the theory of relativity and of which of them is better confirmed by experiment. The first of these, the aether-dynamic theory of relativity (ADTR), was formed during the period 1880-1904 through the efforts of Lorentz and Poincaré, on the basis of Maxwell's theory. In ADTR all tangible manifestations and relations originate in the aether. If the aether is abandoned, the possibility of understanding the mechanism of transverse polarization and the transfer of light waves through the aether is lost forever, which hindered the search for methods to detect the absolute motion of inertial objects with respect to the "stationary" aether. The second (the concept of SRT, 1905) is "ADTR without aether", since the unobservability of absolute motion is postulated in it, and the "relativism of ADTR" is reduced in SRT to the kinematics of pairs of objects moving in "empty space". Maxwell's conjecture (1877) about the existence of an anisotropy of the velocity of light (c) in an aether filled with particles moving translationally together with the Earth has for more than 100 years cast doubt on the legitimacy of abandoning the aether. Systematic observations of non-zero effects of second order in υ/c reveal the anisotropy of the speed of light. I have obtained direct experimental proof that second-order effects are observable only in the "mixture" of particles with aether (1968). Their magnitude turned out to be proportional to the polarization contribution Δε of the particles to the total permittivity of the medium, ε = 1 + Δε. The necessity of appealing to the aether (ε_aether = 1!?) in Michelson-type experiments creates an experimental fact coercing SRT to return to the ADTR conception. As for the tests of SRT and "relativistic practice" (non-Galilean Lorentz invariance of the mathematical forms of laws; the relativistic velocity-addition rule; the transverse Doppler effect of order υ²/c²; the concept of mc²; the relativistic Hamiltonian that governs particle accelerators; relativistic electrodynamics, and much more), all of this was born either on the basis of ADTR ideas that appeared before 1904, or was achieved by the collective mind of the "relativists of SRT" of the 20th century, whom the aether did not impede but rather helped.
Category: Relativity and Cosmology

[55] viXra:1208.0066 [pdf] replaced on 2012-10-26 19:43:33

The Mythos of a Theory of Everything

Authors: Armin Nikkhah Shirazi
Comments: 11 Pages. Final version of paper as it appeared on the fourth FQXi essay contest

A fundamental assumption embedded in our current worldview is that there exists an as yet undiscovered 'theory of everything', a final unified framework according to which all interactions in nature are but different manifestations of the same underlying thing. This paper argues that this assumption is wrong because our current distinct fundamental theories of nature already have mutually exclusive domains of validity, though under our current worldview this is far from obvious. As a concrete example, it is shown that if the concepts of mass in general relativity and quantum theory are distinct in a specific way, their domains become non-overlapping. The key to recognizing the boundaries of the domains of validity of our fundamental theories is an aspect of the frame of reference of an observer which has not yet been appreciated in mainstream physics. This aspect, called the dimensional frame of reference (DFR), depends on the number of length dimensions that constitute an observer frame. Edwin Abbott's Flatland is used as a point of departure from which to provide a gentle introduction to the applications of this idea. Finally, a metatheory of nature is proposed to encompass the collection of theories of nature with mutually exclusive domains of validity.
Category: Quantum Gravity and String Theory

[54] viXra:1208.0064 [pdf] submitted on 2012-08-15 18:44:06

The Wave Equation and Rotation

Authors: Gary D. Simpson
Comments: 32 Pages.

A vector solution to the spherical wave equation is presented. The solution requires rotation of the wave media and a minor revision to the form of the wave equation.
Category: Quantum Physics

[53] viXra:1208.0063 [pdf] submitted on 2012-08-15 10:51:00

On the Travelling Salesman Algorithm: An Application

Authors: David Grace, Alessandro Waldron, Tahir Ahmad
Comments: 7 Pages.

The aim of this paper is to set up a simulation model of the production process of an aircraft company in order to obtain a tool for process analysis and decision support. To achieve this objective, ProModel has been used as the simulation software. The advantages of the tools used are evaluated with regard to correct and efficient internal movement, the different layouts, and the possible materials-handling systems.
Category: Artificial Intelligence

[52] viXra:1208.0062 [pdf] submitted on 2012-08-15 02:55:38

Einstein or Newton

Authors: Emil Gigov
Comments: 3 Pages.

In the theory of relativity there are fundamental internal contradictions. The most direct of them is between the two sides of one unequal equation intended to transform time. These contradictions prove that this theory is absolutely wrong.
Category: Relativity and Cosmology

[51] viXra:1208.0060 [pdf] replaced on 2012-08-28 14:51:49

Coupling Constants and the Fields of Space

Authors: Richard A. Peters
Comments: 15 Pages.

Two fields of space are introduced: a temporal field that supports the propagation of photons and an inertial field that is involved in the inertial reaction of matter particles. Arguments are advanced to support the assertion that both the temporal and inertial fields are subject to gravity and, hence, that the fields comprise a single temporal-inertial (TI) field. Coupling among the TI field, matter particles and gravity affects measures of inertia, mass and the gravitational constant. The coupling of a matter particle with the TI field is a measure of the inertial mass of that particle. Acceleration of a matter particle with respect to the TI field produces the inertial reaction force. It follows that 1) acceleration of particles of the TI field by gravity transmits the gravitational force to matter particles, which then move with the same acceleration, and 2) matter particles are not directly subject to the gravitational force. The TI field supports the propagation of light, which moves at the velocity c relative to the field. A third field, the static field, is not subject to gravity but is coupled with the TI field and counteracts the acceleration of particles of the TI field in their response to gravity.
Category: Relativity and Cosmology

[50] viXra:1208.0059 [pdf] replaced on 2012-11-08 13:21:59

Wrong Assumptions vs Right Assumptions?

Authors: Yuri Danoyan
Comments: 8 Pages. Extended version

Assumptions of physics which need revision: 1) the 4D spacetime continuum; 2) gravity as a fundamental interaction; 3) the three fundamental dimensional constants (G, c, h). Alternatives are proposed: 1) splitting 3D discrete space from 1D continuous time; 2) gravitation as an emergent effect of the Universe; 3) only the Planck constant as a fundamental dimensional constant, so that as a consequence only the Planck mass unit makes sense.
Category: High Energy Particle Physics

[49] viXra:1208.0058 [pdf] submitted on 2012-08-13 23:15:22

Gravitational Waves

Authors: Kenneth Dalton
Comments: 11 Pages. Journal Ref: Hadronic J. 35(2), 209-220 (2012)

The proposed theory of gravitation is summarized, with a focus on dynamics. The linearized field equations are applied to gravitational waves. The theory predicts that longitudinal waves would be detected, which exert a force in the direction of propagation. It also explains the failure at LIGO and elsewhere to find transverse gravitational waves.
Category: Relativity and Cosmology

[48] viXra:1208.0057 [pdf] submitted on 2012-08-13 12:47:23

Derivation of Three Fundamental Masses and Large Numbers Hypothesis by Dimensional Analysis

Authors: Dimitar Valev
Comments: 12 pages

Three mass-dimension quantities have been derived by dimensional analysis by means of fundamental constants – the speed of light in vacuum (c), the gravitational constant (G), the Planck constant (h_bar) and the Hubble constant (H). The extremely small mass m1 ~ (h_bar*H)/c^2 ~ 10^(-33) eV has been identified with the Hubble mass mH, which seems close to the graviton mass mG. The enormous mass m2 ~ c^3/(G*H) ~ 10^53 kg is close to the mass of the Hubble sphere and practically coincides with the Hoyle-Carvalho formula for the mass of the observable universe. The third mass m3 ~ [(H*h_bar^3)/G^2]^(1/5) ~ 10^7 GeV could not be unambiguously identified at the present time. Besides, the remarkable fact has been found that the Planck mass mPl ~ Sqrt[(h_bar*c)/G] appears as the geometric mean of the extreme masses m1 and m2. Finally, the substantial large number N = Sqrt[c^5/(2*G*h_bar*H^2)] ≈ 5.73×10^60 has been derived, relating cosmological parameters (mass, density, age and size of the observable universe) and fundamental microscopic properties of matter (Planck units and the Hubble mass). Thus, a precise formulation and proof of the Large Numbers Hypothesis (LNH) has been found.
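A quick numerical check of the quoted order-of-magnitude estimates, assuming H ≈ 70 km/s/Mpc (the abstract does not state the value of the Hubble constant actually used):

```python
import math

# Order-of-magnitude check of the three dimensional-analysis masses and the large number N.
c, G, hbar = 2.998e8, 6.674e-11, 1.055e-34           # SI units
H = 70 * 1000 / 3.086e22                              # Hubble constant in s^-1 (assumed 70 km/s/Mpc)

m1 = hbar * H / c**2                                  # Hubble mass, ~1e-33 eV/c^2
m2 = c**3 / (G * H)                                   # ~1e53 kg, mass of the Hubble sphere
m3 = (H * hbar**3 / G**2) ** 0.2                      # ~1e7 GeV/c^2
m_pl = math.sqrt(hbar * c / G)                        # Planck mass
N = math.sqrt(c**5 / (2 * G * hbar * H**2))           # large number, ~5.7e60

print(f"m1 ~ {m1 * c**2 / 1.602e-19:.1e} eV")
print(f"m2 ~ {m2:.1e} kg")
print(f"m3 ~ {m3 * c**2 / 1.602e-10:.1e} GeV")
print(f"sqrt(m1*m2) = {math.sqrt(m1 * m2):.2e} kg vs Planck mass {m_pl:.2e} kg")
print(f"N ~ {N:.2e}")
```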
Category: Relativity and Cosmology

[47] viXra:1208.0056 [pdf] submitted on 2012-08-13 14:31:59

Consciousness, Time and Complexity

Authors: Vahid R. Ramezani
Comments: 11 Pages.

We explore the connection between the mind and the brain. We propose that consciousness is the consequence of processing information and that the solution to the binding problem does not entail quantum mechanical coherence or entanglement. We argue for an alternative inspiration from quantum mechanics and quantum field theory based on time-energy uncertainty: not to reduce consciousness to a quantum wave function, but to see what quantum mechanics teaches us about information, time, complexity and transformation. We introduce three postulates and a law governing cognitive systems.
Category: General Science and Philosophy

[46] viXra:1208.0055 [pdf] submitted on 2012-08-12 12:01:49

Two Kinds of Potential Difference for a Capacitor

Authors: Hamid V. Ansari
Comments: 5 Pages.

It is shown that, contrary to the current belief that the electrostatic potential difference between the two conductors of a capacitor is the same as the potential difference between the two poles of the battery which has charged it, the first is two times greater than the second. We see the influence of this in the experiments performed for the determination of the charge and mass of the electron.
Category: Classical Physics

[45] viXra:1208.0053 [pdf] submitted on 2012-08-12 10:38:58

On an Application of Bayesian Estimation

Authors: Kiyoharu Tanaka, Evgeniy Grechnikov
Comments: 5 Pages.

This paper explains the Bayesian version of estimation as a method for calculating the credibility premium or the credibility number of claims for short-term insurance contracts using two ingredients: past data on the risk itself and collateral data from other sources considered to be relevant. The Poisson/gamma model for estimating the claim frequency of a portfolio of policies and the normal/normal model for estimating the pure premium are explained and applied.
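A minimal sketch of the Poisson/gamma credibility update described above; the prior parameters and claim counts are made-up example numbers.

```python
# Poisson/gamma Bayesian credibility: the posterior mean claim frequency equals
# Z * sample_mean + (1 - Z) * prior_mean with credibility factor Z = n / (n + beta).
alpha, beta = 50.0, 5.0              # gamma prior from collateral data (assumed): prior mean = 10
claims = [12, 9, 15, 11, 13]         # hypothetical yearly claim counts for the risk itself

n, total = len(claims), sum(claims)
posterior_mean = (alpha + total) / (beta + n)              # gamma posterior mean
Z = n / (n + beta)                                         # credibility factor
credibility_estimate = Z * (total / n) + (1 - Z) * (alpha / beta)

assert abs(posterior_mean - credibility_estimate) < 1e-12  # the two forms are identical
print(Z, credibility_estimate)                             # 0.5 and 11.0 for these numbers
```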
Category: Statistics

[44] viXra:1208.0052 [pdf] replaced on 2014-01-05 14:30:48

Newtonian Physics Can Explain Relativistics Experiments Like Mass Variation, Time Dilation, Michelson Morley, Doppler Effect, Etc.

Authors: Silas Sacheli Santos
Comments: 17 Pages.

Using Newtonian physics we can explain some relativistic experiments, such as mass variation, time dilation, Michelson-Morley, the transverse Doppler effect, the mass-energy relation, etc. In some explanations the mathematical treatment is at an initial level, using numerical calculations and some approximations, and we obtain only moderate agreement. We therefore need to continue the development toward the complete equations and verify whether better agreement exists. Thus, Newtonian physics needs more research before being considered invalid.
Category: Relativity and Cosmology

[43] viXra:1208.0051 [pdf] replaced on 2013-06-14 22:02:09

Do Heavy Gauge Bosons Exist?

Authors: Kenneth Dalton
Comments: 4 Pages. Journal-Ref: Hadronic J, 39(1), 67 (2016)

Recently, transverse and longitudinal solutions were found for the W and Z bosons. They satisfy the nonlinear cubic wave equation. Here, new solutions are given which correspond to particles with mass m'_W = 114 GeV and m'_Z = 129 GeV. A neutral particle has been found at CERN near 129 GeV. It remains to be seen whether charged particles will be discovered at 114 GeV.
Category: High Energy Particle Physics

[42] viXra:1208.0050 [pdf] submitted on 2012-08-12 06:16:56

Unus Mundus is the Immanent + Transcendent ?

Authors: Elemer E Rosinger
Comments: 4 Pages.

Recently, in [3], a non-ontological definition of ontology was suggested with the help of four questions. Here several immediate developments are presented.
Category: General Science and Philosophy

[41] viXra:1208.0049 [pdf] submitted on 2012-08-12 06:17:32

种飞机图像目标多特征信息融合识别方法 / A Multi-Feature Information Fusion Method for Aircraft Image Target Recognition [in Chinese]

Authors: Xin-De Li, Wei-Dong Yang, Jean Dezert
Comments: 10 Pages.

A multi-feature fusion recognition algorithm for aircraft image targets based on probabilistic neural networks (PNN) and DSmT (Dezert-Smarandache Theory) reasoning is proposed. For the multiple extracted image features, the idea of data fusion is used to combine the information provided by each feature of the image target. First, the image is binarized as preprocessing, and five features are extracted: Hu moments, normalized moment of inertia, affine invariant moments, contour discretization parameters, and singular-value features. Second, to address the difficulty of constructing belief assignments in Dezert-Smarandache Theory, a PNN is used to build a target recognition-rate matrix, through which belief assignments are given to the evidence sources. Then the DSmT combination rule is applied for fusion at the decision level, thereby completing the recognition of the aircraft target. Finally, for small distortions of the target image, the proposed multi-feature information fusion method is compared experimentally with single-feature methods; the results show that under the same conditions the correct recognition rate is greatly improved while meeting real-time requirements, and the method also has an effective rejection capability and is insensitive to target image size. Even under large distortions, the recognition rate reaches 89.3%.
Category: Artificial Intelligence

[40] viXra:1208.0048 [pdf] submitted on 2012-08-12 02:37:48

What is Mass? Chapter One: Mass in Newtonian Mechanics and Lagrangian Mechanics

Authors: Xiong Wang
Comments: 13 Pages. author name: Xiong WANG Email:wangxiong8686@gmail.com

"To see a World in a Grain of Sand, And a Heaven in a Wild Flower." We will try to see the development and the whole picture of theoretical physics through the evolution of the very fundamental concept of mass: 1. The inertial mass in Newtonian mechanics; 2. The Newtonian gravitational mass; 3. Mass in the Lagrangian formalism; 4. Mass in the special theory of relativity; 5. E = mc^2; 6. Mass in quantum mechanics; 7. The principle of equivalence and general relativity; 8. The energy-momentum tensor in general relativity; 9. Mass in the standard model of particle physics; 10. The Higgs mechanism.
Category: Mathematical Physics

[39] viXra:1208.0047 [pdf] submitted on 2012-08-11 04:24:17

Four Questions Can Define the Transcendental ?

Authors: Elemer E Rosinger
Comments: 3 Pages.

Usual definitions of the transcendental are given by ontological assumptions. Typical in this regard are those in various theologies or philosophies. Needless to say, such ontological assumptions can easily be challenged; in fact, they actually invite such challenges. Plato's Cave Allegory in his book "Republic" is an exception, since it can be seen as a definition of the transcendental, albeit rather indirectly and through a quite involved story. As such, it is not at all about any ontological assumption, but only about gnoseology, epistemology and pragmatics. Here, a similar definition of the transcendental is suggested, namely, a definition which does not use any ontological assumption and instead refers only to gnoseology, epistemology and pragmatics. The novelty is in the fact that the mentioned definition consists of nothing more than four successive questions.
Category: General Science and Philosophy

[38] viXra:1208.0046 [pdf] submitted on 2012-08-10 07:26:08

Time Dependence of the Masses of the Pions

Authors: Lello Boscoverde
Comments: 2 Pages.

Recent work in Eddingtonian cosmology has demonstrated the relation of the visible mass of the universe to the spatial extent of the pions. Building on this finding, we find that the masses of the pions themselves are dependent on the age of the universe.
Category: Relativity and Cosmology

[37] viXra:1208.0045 [pdf] submitted on 2012-08-10 16:16:52

Variation of Vacuum Energy if Scale Factor Becomes Infinitely Small, with Fixed Entropy Due to a Non Pathological Big Bang Singularity Accessible to Modified Einstein Equations

Authors: A.W. Beckwith
Comments: 11 Pages.

A thought experiment as to how one might use the idea of a minimum time length, even if spatial length goes to zero, in the early phases of cosmology, as a response to Stoica's paper on a redone version of the Einstein equations.
Category: Quantum Gravity and String Theory

[36] viXra:1208.0044 [pdf] replaced on 2013-01-13 00:53:01

Information and Physics

Authors: Serge Mijatovic
Comments: 14 Pages. I accidentally submitted a change for this; I meant a replacement. I am sorry. This is what I intended.

What would the Universe look like if information processing was at its very core? What is the most likely and optimum way of this information use? This paper explores a fundamental scenario of a possible connection between information and physics. Since the approach taken is antecedent to first principles of physics, we will rely only on axiomatic notions of information use, and deliberately ignore the body of physics to avoid conclusions isomorphic to it. Essential relativistic, gravitational and quantum phenomena are derived as limiting cases, including time dilation effect of Einstein’s relativity theories as a special limiting case.
Category: Relativity and Cosmology

[35] viXra:1208.0043 [pdf] submitted on 2012-08-09 20:27:21

Vector Gauge Boson Dark Matter for the Su(n) Gauge Group Model

Authors: E.Koorambas
Comments: 17 Pages.

The existence of dark matter is explained by a new stable neutral vector boson, the C-boson, of mass 900 GeV, predicted by the Wu mechanism for mass generation of gauge fields. According to the Standard Model (SM), the W and Z bosons normally get their masses through coupling with the SM Higgs particle of mass 125 GeV. We compute the self-annihilation cross section of the vector gauge boson C dark matter and calculate its relic abundance. We also study the constraints suggested by dark-matter direct-search experiments.
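For orientation, the textbook freeze-out rule of thumb for a thermal relic is Ωh² ≈ 3×10⁻²⁷ cm³ s⁻¹ / ⟨σv⟩; this is the generic approximation, not the paper's own C-boson calculation, and the cross-section value below is an assumed example.

```python
# Generic thermal-relic estimate (rule of thumb), not the paper's computation.
sigma_v = 3e-26                        # assumed thermally averaged annihilation cross section, cm^3/s
omega_h2 = 3e-27 / sigma_v
print(f"Omega h^2 ~ {omega_h2:.2f}")   # ~0.1, roughly the observed dark-matter density
```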
Category: High Energy Particle Physics

[34] viXra:1208.0042 [pdf] submitted on 2012-08-09 20:49:58

Why Baryons May Be Yang-Mills Magnetic Monopoles

Authors: Jay R. Yablon
Comments: 11 Pages.

We demonstrate that Yang-Mills Magnetic Monopoles naturally confine their gauge fields, naturally contain three colored fermions in a color singlet, and that mesons also in color singlets are the only particles they are allowed to emit or absorb. This makes them worthy of serious consideration as baryons.
Category: High Energy Particle Physics

[33] viXra:1208.0041 [pdf] replaced on 2014-04-08 23:14:16

On the W and Z Masses

Authors: Kenneth Dalton
Comments: 17 Pages. Journal-Ref: Hadronic J. 38(4), 429-445 (2015)

Scalar and vector fields are coupled in a gauge invariant manner, such as to form massive vector fields. In this, there is no condensate or vacuum expectation value. Transverse and longitudinal solutions are found for the W and Z bosons. They satisfy the nonlinear cubic wave equation. Total energy and momentum are calculated, and this determines the mass ratio m_W/m_Z.
Category: High Energy Particle Physics

[32] viXra:1208.0039 [pdf] submitted on 2012-08-09 12:50:37

If Initial Length of Space-Time Goes to a Non Pathological Big Bang Singularity Initially, What Happens to Vaccuum Energy Evolution?

Authors: A.W. Beckwith
Comments: 15 Pages.

What happens to vacuum energy if Stoica's calculations (2012) remove the necessity of avoiding analysis of the Big Bang singularity in Einstein's equations?
Category: Quantum Gravity and String Theory

[31] viXra:1208.0038 [pdf] submitted on 2012-08-09 12:52:59

Limit Treatment of Space-Time Foam Calculations if Initial Length of Space-Time Collapses to a Non Pathological Big Bang Singularity Accessible to Modified Einstein Equations

Authors: A.W. Beckwith
Comments: 16 Pages.

Problems with space-time foam, if Big Bang singularities are no longer inaccessible to the modified version of the Einstein equations given by Stoica (2012).
Category: Quantum Gravity and String Theory

[30] viXra:1208.0037 [pdf] submitted on 2012-08-09 12:55:56

Vacuum Energy and Extension to Machian Space-Time Physics by Linkage of Gravitons (Today) with Gravitinos (Ew Era) as Answer to VOLOVIK’S Vacuum Energy, Myths and Reality Document Questions

Authors: A.W. Beckwith
Comments: 15 Pages.

How to answer Volovik's questions in his "Vacuum energy, myths and realities" document, if Big Bang singularities are no longer inaccessible to the modified version of the Einstein equations given by Stoica (2012). We first summarize the highlights of the 2006 paper by Volovik on vacuum energy and afterwards state that the non-applicability of "Fjortoft's theorem" for defining necessary conditions for instability solves certain problems raised by Volovik. The myths and realities of vacuum energies as stated by Volovik imply in particular that there can be no (local?) perturbations of the quantum vacuum leading to a nonzero vacuum energy. Our paper applies the non-applicability of "Fjortoft's theorem" as another mechanism which could lead to a nonzero vacuum energy. We apply this theorem to what Padmanabhan calls a thermodynamic potential, which could show initial conditions implying (structural) instability if the conditions for the application of "Fjortoft's theorem" hold. In our case there is no instability, so a different mechanism exists for constructing vacuum energy. We appeal to Machian physics to account for the behavior of massive gravitons with DE, in sync with extending answers to Volovik's questions and identifying vacuum energy with DE. We then use branes-antibranes to create DE. A key point is also the uniformity of Planck's constant in cosmology, so as to preserve consistency of physical evolution.
Category: Quantum Gravity and String Theory

[29] viXra:1208.0036 [pdf] submitted on 2012-08-08 17:19:13

Mathematical Follow-up for Dark Energy and Dark Matter in the Double Torus Universe.

Authors: Dan Visser
Comments: 7 Pages.

The main issue in this paper is the mathematics presented for the maximum of dark energy, which depends on the information differences on the wall of any volume in the Double Torus. Secondly, the expressions must be worked out further, as an invitation to see how they are triggered by my idea that the universe has a Double Torus geometry. Thirdly, I go deeper into the details of dark matter, not only stating that dark matter is a spatial particle that spins and gets its energy from its acceleration in a dark matter torus, but also proposing that dark matter gets its mass from the vacuum energy. I lay out the conditions for understanding why the Big Bang dynamics is therefore a part of the Double Torus and how the dark flow in the universe emerges from the Double Torus dark energy equation. Fourthly, I refer to the claim that neutrinos should be sensitive to the flow of dark matter particles, expressed in the set of equations of a former paper; this paper extensively amplifies that theoretical neutrino evidence, despite all the confusion around the truth of neutrinos faster than light. Fifthly, I review some dark energy and dark matter issues from some of my former papers.
Category: Mathematical Physics

[28] viXra:1208.0035 [pdf] replaced on 2012-09-17 13:37:57

A Precise Information Flow Measure from Imprecise Probabilities; in Slides

Authors: Sari Haj Hussein
Comments: 44 Pages.

This is a slide presentation of the paper entitled "A Precise Information Flow Measure from Imprecise Probabilities", which can be found at: http://dx.doi.org/10.1109/SERE.2012.25.
Category: General Science and Philosophy

[27] viXra:1208.0034 [pdf] submitted on 2012-08-09 10:19:22

The Direction of Gravity

Authors: Richard J. Benish
Comments: 5 Pages. Free, properly attributed, unaltered distribution is encouraged

How much do we really know about gravity? Though our knowledge is sufficient to send people to the Moon, there is a large and fundamental gap in our empirical data; and there are basic questions about gravity that are rarely even asked, and so remain unanswered. The gap concerns the falling of test objects near the centers of larger gravitating bodies. Newton's theory of gravity and Einstein's theory, General Relativity, though giving essentially the same answers, describe the problem quite differently. A discussion of this difference--which emphasizes the role of clock rates in Einstein's theory--evokes a question concerning the most basic characteristic of any theory of gravity: Is the motion due to gravity primarily downward or upward; i.e., inward or outward? Have our accepted theories of gravity determined this direction correctly? The answer to this question may seem obvious. We will find, however, that we don't really know. And most importantly, it is emphasized that we can get an unequivocal answer by performing a relatively simple laboratory experiment.
Category: Relativity and Cosmology

[26] viXra:1208.0033 [pdf] replaced on 2012-08-19 14:23:43

The Helical Structure of the Electromagnetic Gravity Field

Authors: Frank H. Makinson
Comments: 7 Pages. Classical physics can be used to describe the electromagnetic field structure of the phenomenon called gravity

Within the universe an influence creates processes and forms with helicity and spin, and there is a handedness bias. The scale of objects with helicity and spin ranges from galaxies to minuscule atomic structures. A helical structure has chiral properties which allow helical structures of the same type at 180° to each other to merge effortlessly. Gravity with a helical electromagnetic field structure can influence the helicity and spin of everything. The form of the helical electromagnetic field structure that produces an attractant-only type of gravity force is described.
Category: Classical Physics

[25] viXra:1208.0032 [pdf] replaced on 2018-03-27 17:01:22

House of Cards Built One Meter at a Time

Authors: Frank H. Makinson
Comments: 6 Pages.

A physical law assumption is based upon a knowledge set extracted using observation and measurement techniques available at the time the assumption was made. An assumption can stifle scientific inquiry if it is allowed to become a protected paradigm, and thus, unchallengeable. Units of measure are a core element of physical law inquiry and an erroneous assumption used in selecting the base units can hinder the inquiry process significantly.
Category: History and Philosophy of Physics

[24] viXra:1208.0031 [pdf] replaced on 2013-12-16 09:05:48

Clumpy Dark Matter Around Dwarf Galaxies a Support for an Alternative Black Hole Theory According to the Quantum Function Follows Form Model.

Authors: Leo Vuyk
Comments: 21 Pages.

In particle physics it is an interesting challenge to postulate that the FORM and structure of elementary particles are the origin of the different FUNCTIONS of these particles. In our former paper, "3-Dimensional String based alternative particle model," we presented a possible solution based on complex 3-D ring shaped particles. We give it the name FFF theory (Function Follows Form theory). This paper presents the possible consequences of such a 3-D string particle system for black holes. Black holes should exist at all scales, from microscopic (ball lightning) up to supergiant Big Bang splinter black holes (Galaxy Anchor Black Holes: GABHs), and should be the one and only origin of all dark matter. Recent clumpy dark matter observations around dwarf galaxies seem to support the new paradigm black hole of Q-FFF theory.
Category: Astrophysics

[23] viXra:1208.0030 [pdf] replaced on 2013-08-08 04:43:32

Synchronous Interlocking of Discrete Forces: Strong Force Reconceptualised in a NLHV Solution

Authors: Dirk Pons, Arion D. Pons, Aiden J. Pons
Comments: Pages. Published as: Pons, D.J., A.D. Pons, and A.J. Pons, Synchronous interlocking of discrete forces: Strong force reconceptualised in a NLHV solution Applied Physics Research, 2013. 5(5): p. 107-126. DOI: http://dx.doi.org/10.5539/apr.v5n5107

The conventional requirements for the strong force are that it is strongly attractive between nucleons whether neutral neutrons or positively charged protons; that it is repulsive at close range; that its effect drops off with range. However theories, such as quantum chromodynamics, based on this thinking have failed to explain nucleus structure ab initio starting from the strong force. We apply a systems design approach to this problem. We show that it is more efficient to conceptualise the interaction as interlocking effect, and develop a solution based on a specific non-local hidden-variable design called the Cordus conjecture. We propose that the strong force arises from particules synchronising their emission of discrete forces. This causes the participating particules to be interlocked: the interaction pulls or repels particules into co-location and then holds them there, hence the apparent attractive-repulsive nature of that force and its short range. Those discrete forces are renewed at the de Broglie frequency of the particule. The Cordus theory answers the question of how the strong force attracts the nucleons (nuclear force). We make several novel falsifiable predictions including that there are multiple types of synchronous interaction depending on the phase of the particules, hence cis- and trans-phasic bonding. We also predict that this force only applies to particules in coherent assembly. A useful side effect is that the theory also unifies the strong and electro-magneto-gravitation (EMG) forces, with the weak force having a separate causality. The synchronous interaction (strong force) is predicted to be intimately linked to coherence, with the EMG forces being the associated discoherent phenomenon. Thus we further predict that there is no need to overcome the electrostatic force in the nucleus, because it is already inoperative when the strong force operates. We suggest that ‘strong’ is an unnecessarily limiting way of thinking about this interaction, and that the ‘synchronous’ concept offers a more parsimonious solution with greater explanatory power for fundamental physics generally, and the potential to explain nuclear mechanics.
Category: Nuclear and Atomic Physics

[22] viXra:1208.0029 [pdf] replaced on 2012-10-30 20:33:20

Lepton Electric Charge Swap at the 10 Tev Energy Scale

Authors: E.Koorambas
Comments: 18 Pages.

We investigate the possibility that a neutral non-regular lepton of mass 1784 MeV and a charged non-regular lepton of mass 35 MeV exist in six-dimensional space-time. This proposition provides a global rotational symmetry between the ordinary third family of leptons and the proposed non-regular leptons. The electric charge swap between ordinary leptons produces heavy neutral non-regular leptons of mass 1784 MeV, which may form cold dark matter. The existence of these proposed leptons can be tested once the Large Hadron Collider (LHC) becomes operative at the 10 TeV energy scale. This proposition may have far-reaching applications in astrophysics and cosmology.
Category: High Energy Particle Physics

[21] viXra:1208.0026 [pdf] submitted on 2012-10-01 12:44:19

Internuclear Separations Using Least Squares and Neural Networks for 46 New S and P Electron Diatomics

Authors: Ray Hefferlin
Comments: 9 Pages. May appear in Int. J. Mol. Model, Vol 4, #1

Combined least-squares and neural-network forecasts for internuclear separations of main-group diatomic molecules, most with from 9 to 12 atomic valence electrons, are presented. We require that the standard-deviation bounds of the forecasts overlap each other; this requirement is met by 65 molecules, of which 46 seem not to have been studied previously. The composite errors average 0.1036Å on either side of the composite predictions. There is agreement with 33 of 41 independent test data (80.5%), and those not in agreement fall outside the composite error limits by an average of 1.83%.
Category: Chemistry

[20] viXra:1208.0024 [pdf] submitted on 2012-08-07 05:27:27

Eight Assumptions of Modern Physics Which Are Not Fundamental

Authors: Juan Ramón González Álvarez
Comments: 9 Pages. FQXi 2012 Essay Contest "Which of Our Basic Physical Assumptions Are Wrong?"

This essay considers eight basic physical assumptions which are not fundamental: (i) spacetime as the arena for physics, (ii) unitarity of the dynamics, (iii) microscopic time-reversibility, (iv) the need for black hole thermodynamics, (v) state vectors as the general description of quantum states, (vi) general relativity as a field theory, (vii) dark matter as real matter, (viii) and cosmological homogeneity. This selection ranges from micro-physics to cosmology, but is not exhaustive.
Category: Quantum Gravity and String Theory

[19] viXra:1208.0023 [pdf] submitted on 2012-08-06 18:04:54

A Novel Framework for Interpreting Quantum Mechanics

Authors: Armin Nikkhah Shirazi
Comments: 1 Page. original date of publication was 4/25/2012

This is an abstract article published in the proceedings of the "Quantum Malta 2012: fundamental Problems in Quantum Physics" Conference
Category: Quantum Physics

[18] viXra:1208.0022 [pdf] replaced on 2014-04-03 22:09:17

On Legendre's, Brocard's, Andrica's, and Oppermann's Conjectures

Authors: Germán Paz
Comments: 11 Pages. The title and the abstract have been modified; a few references have been added, as it has been suggested to the author. This paper is also available at arxiv.org/abs/1310.1323.

Let $n\in\mathbb{Z}^+$. Is it true that every sequence of $n$ consecutive integers greater than $n^2$ and smaller than $(n+1)^2$ contains at least one prime number? In this paper we show that this is actually the case for every $n \leq 1,193,806,023$. In addition, we prove that a positive answer to the previous question for all $n$ would imply Legendre's, Brocard's, Andrica's, and Oppermann's conjectures, as well as the assumption that for every $n$ there is always a prime number in the interval $[n,n+2\lfloor\sqrt{n}\rfloor-1]$.
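A brute-force check of the question posed, for small n only (this is merely a sanity check, not the method used in the paper, which verifies the statement up to n = 1,193,806,023).

```python
# Does every run of n consecutive integers strictly between n^2 and (n+1)^2 contain a prime?
def is_prime(k: int) -> bool:
    if k < 2:
        return False
    d = 2
    while d * d <= k:
        if k % d == 0:
            return False
        d += 1
    return True

def windows_ok(n: int) -> bool:
    lo, hi = n * n + 1, (n + 1) ** 2 - 1                 # integers strictly between n^2 and (n+1)^2
    return all(any(is_prime(m) for m in range(s, s + n))
               for s in range(lo, hi - n + 2))           # every length-n window in that range

print(all(windows_ok(n) for n in range(1, 100)))         # True for these small n
```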
Category: Number Theory

[17] viXra:1208.0021 [pdf] submitted on 2012-08-06 08:44:18

Smarandache 函数 及其相关问题研究 Vol.8 / Research on Smarandache Functions and Other Related Problems, Vol. 8 [in Chinese Language Only]

Authors: Wang Tingting, Liu Yanni
Comments: 141 Pages.

This book mainly introduces the latest research progress on Smarandache functions and pseudo-Smarandache functions, several kinds of Smarandache sequences, and other related problems. At the same time it also introduces some other studies in number theory. The content of each chapter and section is arranged relatively independently, so any reader who is interested in this book can begin reading from any chapter. It aims to open up readers' perspective and arouse their interest in studying Smarandache problems.
Category: Number Theory

[16] viXra:1208.0020 [pdf] submitted on 2012-08-07 13:16:43

Smarandache Weak BE-Algebras

Authors: Arsham Borumand Saeid
Comments: Pages.

In this paper, we introduce the notions of Smarandache weak BE-algebras, Q-Smarandache filters and Q-Smarandache ideals. We show that a nonempty subset F of a BE-algebra X is a Q-Smarandache filter if and only if A(x, y) is included in or equal to F, where A(x, y) is a Q-Smarandache upper set. The relationships between these notions are stated and proved.
Category: Algebra

[15] viXra:1208.0019 [pdf] submitted on 2012-08-06 05:59:47

On the Affine Nonlinearity in Circuit Theory

Authors: Emanuel Gluskin
Comments: 34 Pages. This is the set of the slides for my first NDES 2012 lecture, which significantly extends the content of the associated proceedings article.

According to the definition of the linear operator, as accepted in system theory, an affine dependence is a nonlinear one. This implies the nonlinearity of Thevenin's 1-port, while the battery itself is a strongly nonlinear element that in the 1-port's "passive mode" (when the 1-port is fed by a "stronger" circuit) can be replaced by a hard limiter. For the theory, what is important is not the actual creation of the equivalent 1-port, but the selection of one of the ports of a (linear) many-port for interpreting the circuit as a 1-port. A practical example of the affine nonlinearity is given also in terms of waveforms of time functions. Emphasizing the importance of the affine nonlinearity, it is argued that even when straightening the curved characteristic of the solar cell, we retain the main part of the nonlinearity. Finally, the "fractal-potential" and "f-connection-analysis" of 1-ports, which are missing in classical theory, are mentioned.
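A one-line illustration of the central point that an affine map fails the superposition test by which linearity is defined in system theory; the numbers are arbitrary examples, not taken from the paper.

```python
# Thevenin-style affine terminal law v(i) = V_th - R_th * i fails superposition,
# hence it is nonlinear in the system-theory sense (example values only).
V_th, R_th = 12.0, 2.0
v = lambda i: V_th - R_th * i

i1, i2 = 1.0, 3.0
print(v(i1 + i2), v(i1) + v(i2))      # 4.0 vs 16.0: additivity fails
print(v(2 * i1), 2 * v(i1))           # 8.0 vs 20.0: homogeneity fails
```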
Category: General Mathematics

[14] viXra:1208.0018 [pdf] submitted on 2012-08-05 18:41:53

The MSRT, the Interpretation of the Lorentz Transformation Equations, Faster Than Light and the Cherenkov Radiation

Authors: Azzam AlMosallami
Comments: 19 Pages.

In this paper, I give an interpretation of the Lorentz transformation equations based on my Modified Special Relativity Theory (MSRT) [23]. My interpretation illustrates that the Lorentz factor is equivalent to the refractive index in optics. Also, according to my MSRT, it is possible to measure speeds of particles or electromagnetic waves greater than the speed of light in vacuum, but in this case there is no violation of the Lorentz transformation or of causality, and thus the laws of physics remain the same in all inertial frames of reference. On that basis I refute the claim proposed by Cohen and Glashow in their paper [33] refuting the OPERA experiment, which depends on the analogy with Cherenkov radiation; that claim is based on a wrong concept of superluminal speeds, a wrong concept that rests on a flaw in Einstein's special relativity theory.
Category: Relativity and Cosmology

[13] viXra:1208.0017 [pdf] submitted on 2012-08-05 21:04:22

The Orbit Motion in the Gravity Field

Authors: sangwha Yi
Comments: 5 Pages.

In the general relativity theory, using Einstein's gravitational field equation, we find the solution for orbital motion. The solution therefore has an angular velocity. According to the solution, matter rotates under the gravitational force.
Category: Relativity and Cosmology

[12] viXra:1208.0016 [pdf] submitted on 2012-08-05 03:46:09

The Mathematical Basis of the Fine Structure Constant

Authors: Robert Tetlow
Comments: 4 Pages.

It can be shown that the Fine Structure Constant can be defined as: α ≡ 1/√(( -2eπ -e + 2πe + eπ2 + π2)) = 0.007297352558.
Category: Quantum Physics

[11] viXra:1208.0013 [pdf] submitted on 2012-08-04 06:41:17

Derivation of the Lorentz Force Expression from Maxwell's Equations (ВЫВОД ВЫРАЖЕНИЯ СИЛЫ ЛОРЕНЦА ИЗ УРАВНЕНИЙ МАКСВЕЛЛА)

Authors: Etkin V.A.
Comments: 5 Pages. In russian

It is shown that the expression for the magnetic component of the Lorentz force follows directly from Maxwell's equations, if the latter are derived from first principles and contain total time derivatives of the electric and magnetic fields.
Category: Classical Physics

[10] viXra:1208.0011 [pdf] submitted on 2012-08-03 19:23:15

Two New Constants ν and θ and a New Formula π = (1/2)e^θ

Authors: Chen Wenwei
Comments: 6 Pages.

This paper brings to light two new constants and a formula.
Category: Number Theory

[9] viXra:1208.0010 [pdf] submitted on 2012-08-03 10:57:20

The Application of Gödel’s Incompleteness Theorems to Scientific Theories

Authors: Michael James Goodband
Comments: 24 Pages.

It is shown that there exist conditions for which scientific theories qualify as Gödel's 'related systems', and that observable features can exist which cannot be derived within the scientific theory. However, this is just a descriptive problem arising from restricting scientific theories to physically-real terms, and it can be circumvented by the use of non-physically-real terms, which is shown to give a derivation of Quantum Theory. Incompleteness is also shown to be possible in scientific theories of living cells, ecosystems and the economies of nations. The impact on natural language descriptions of these systems is also considered.
Category: General Science and Philosophy

[8] viXra:1208.0009 [pdf] replaced on 2012-08-11 17:00:38

Irrationality of the Euler-Mascheroni Constant

Authors: Andile Mabaso
Comments: 6 Pages.

In this paper we prove that the Euler-Mascheroni constant is irrational and transcendental.
Category: Number Theory

[7] viXra:1208.0008 [pdf] submitted on 2012-08-02 16:50:58

A Challenge to Quantized Absorption by Experiment and Theory

Authors: Eric Stanley Reiter
Comments: 10 pages. Essay was submitted to 2012 FQXI contest

After recognizing dubious assumptions regarding light detectors, a famous beam-split coincidence test of the photon model was performed with gamma-rays instead of visible light. A similar test was performed to split alpha-rays. Both tests are described in detail to justify conclusions. In both tests, coincidence rates greatly exceeded chance, leading to an unquantum effect. This is a strong experimental contradiction to quantum theory and photons. These new results are strong evidence of the long abandoned accumulation hypothesis, also known as the loading theory, and draw attention to assumptions applied to key past experiments that led to quantum mechanics. The history of the loading theory is outlined, including the loading theory of Planck's second theory of 1911. A popular incomplete version of the loading theory that convinced physics students to reject it is exposed. The loading theory is developed by deriving a wavelength equation similar to de Broglie's, from the photoelectric effect equation. The loading theory is applied to the photoelectric effect, Compton effect, and charge quantization, now free of wave-particle duality. It is unlikely that the loading theory can apply to recent claimed success of giant molecule multi-path interference/diffraction, and that claim is quantitatively challenged. All told, the evidence reduces quantized absorption to an illusion, due to quantized emission combined with newly identified properties of the matter-wave.
Category: Quantum Physics
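
For orientation, "chance" in beam-split coincidence tests is usually benchmarked against the accidental-coincidence rate of two uncorrelated detectors. The estimate below is that standard benchmark (window conventions vary by a factor of order unity); it is not a quotation of the paper's own analysis:

$$R_{\mathrm{chance}} \approx R_1\, R_2\, \tau_w,$$

where $R_1$ and $R_2$ are the singles rates of the two detectors and $\tau_w$ is the width of the coincidence window. An observed coincidence rate far above this value is what the abstract refers to as exceeding chance.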

[6] viXra:1208.0007 [pdf] replaced on 2013-05-23 12:46:39

Ultrafast Conversion of Graphite to Diamond in Gravitational Pressure Apparatus

Authors: Fran De Aquino
Comments: 9 Pages.

Currently the artificial production of diamond is very expensive because it consumes large amounts of energy in order to produce a single diamond. Here, we propose a new type of press based on the intensification of the gravitational acceleration. This Gravitational Press can generate pressures several times more intense than the 80 GPa required for ultrafast transformation of graphite into diamond. In addition, due to the enormous pressure that the Gravitational Press can produce, the "synthesis capsule" may be very large (up to about 1000 cm^3 in size). This is sufficient to produce diamonds of 100 carats (20 g) or more. On the other hand, besides the ultrafast conversion, the energy required by the Gravitational Presses is very low, so the production cost of the diamonds becomes very low, which means that they could be produced on a large scale.
Category: Classical Physics

[5] viXra:1208.0006 [pdf] replaced on 2012-12-02 03:57:01

The Radius of the Proton in the Self-Consistent Model

Authors: Sergey G. Fedosin
Comments: 15 pages. Accepted by Hadronic Journal

Based on the notion of strong gravitation acting at the level of elementary particles, and on equating the magnetic moment of the proton with the limiting magnetic moment of a rotating, non-uniformly charged ball, the radius of the proton is found, in agreement with experimental data. At the same time the distribution of the mass and charge density inside the proton is derived. The ratio of the density at the center of the proton to the average density is found to equal 1.57. (A uniform-ball baseline for this kind of estimate is sketched after this entry.)
Category: Nuclear and Atomic Physics
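
For scale only: the uniform-ball version of the stated method can be written down in closed form. The paper uses a non-uniformly charged ball, which changes the numerical coefficient, so the figure below is an order-of-magnitude orientation rather than the paper's result. A uniformly charged ball of charge $e$, radius $R$ and angular velocity $\omega$ has magnetic moment $\mu = e\omega R^2/5$; requiring the equatorial speed $\omega R$ not to exceed $c$ gives the limiting moment $\mu_{\max} = ecR/5$, and equating this to the proton's magnetic moment $\mu_p \approx 1.41\times10^{-26}$ J/T gives

$$R \sim \frac{5\mu_p}{ec} = \frac{5\times 1.41\times10^{-26}\ \mathrm{J/T}}{(1.602\times10^{-19}\ \mathrm{C})(3.00\times10^{8}\ \mathrm{m/s})} \approx 1.5\times10^{-15}\ \mathrm{m}.$$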

[4] viXra:1208.0005 [pdf] replaced on 2014-06-13 08:34:01

Discussions on the Hypothesis that Cosmic Ray Exposure on Sirius B Negates Terrestrial MBH Concerns from Colliders

Authors: Thomas B Kerwick
Comments: 16 Pages.

This paper serves to document discussions held in public forum [1] with Prof. Otto E. Rössler, chaos theory expert and outsider academic on Large Hadron Collider (LHC) safety procurement. A recent paper [2] by Prof. Rössler on further implications of Einstein's equivalence principle suggests that if MBH were created on Earth in LHC experiments, a topic explored in the earlier work of Giddings & Mangano [3], they would pose an existential risk to planet Earth through an exponential accretion process; these discussions serve to counter such claims through hypotheses on stable TeV-scale MBH under such conditions. Whereas G&M have already drawn on the existence of neutron stars and white dwarfs in their safety assurances on LHC experiments [3], these discussions focused on Sirius B as a simple case study, the white dwarf companion of the closely proximate Sirius pair and the nearest white dwarf to Earth, with a casual overview of the rate of CR flux on the Sirius pair and of gravitational capture of the products of CR exposure on Sirius B.
Category: High Energy Particle Physics

[3] viXra:1208.0003 [pdf] submitted on 2012-08-02 04:42:15

Angular Precession of Elliptic Orbits. Mercury

Authors: Javier Bootello
Comments: 10 Pages. Comments are welcome

The relativistic precession of Mercury, 43.1 seconds of arc per century, results from a secular addition of 5.02 × 10^-7 rad at the end of every orbit around the Sun. The question addressed in this paper is to analyse the angular precession at each single point of the elliptic orbit and to determine its magnitude and its oscillation around the mean value, comparing key theoretical proposals. It is also underlined that this astronomical determination has not yet been achieved, so the MESSENGER spacecraft, now orbiting the planet, should provide an opportunity to perform it. That event would clarify important issues, now that we are close to the centenary of the formulation and first success of General Relativity. (A consistency check of the quoted per-orbit value against the per-century figure is sketched after this entry.)
Category: Relativity and Cosmology
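
The two figures quoted in this abstract are mutually consistent, which can be checked with Mercury's orbital period of about 87.97 days (≈ 0.2408 yr) and the conversion 1 rad ≈ 206265 arcseconds; the per-orbit value itself follows from the standard general-relativistic formula with GM_⊙ ≈ 1.327 × 10^20 m^3/s^2, a ≈ 5.79 × 10^10 m and e ≈ 0.206:

$$\Delta\varphi_{\mathrm{orbit}} = \frac{6\pi G M_\odot}{c^{2} a (1-e^{2})} \approx 5.0\times10^{-7}\ \mathrm{rad}, \qquad \Delta\varphi_{\mathrm{century}} \approx 5.02\times10^{-7}\ \mathrm{rad}\times\frac{100\ \mathrm{yr}}{0.2408\ \mathrm{yr}}\times 206265\ \tfrac{''}{\mathrm{rad}} \approx 43''.$$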

[2] viXra:1208.0002 [pdf] replaced on 2012-08-19 04:35:15

The Recent Vision About Preferred Extremals and Solutions of the Modified Dirac Equation

Authors: Matti Pitkanen
Comments: 48 Pages.

Over the years several proposals have been made for what the preferred extremals of Kähler action and the solutions of the modified Dirac equation could be, and the challenge is to see whether at least some of these approaches are consistent with each other. It is good to list the various approaches first.

  1. For preferred extremals, the generalization of conformal invariance to the 4-D situation is a very attractive approach and leads to concrete conditions formally similar to those encountered in string models. The approach based on basic heuristics for massless equations, on effective 3-dimensionality, and on the weak form of electric-magnetic duality is also promising. An alternative approach is inspired by number-theoretical considerations and identifies space-time surfaces as associative or co-associative sub-manifolds of the octonionic imbedding space.
  2. There are also several approaches to solving the modified Dirac equation. The most promising one assumes that the solutions are restricted to 2-D string world sheets and/or partonic 2-surfaces. This strange-looking view is a rather natural consequence of the number-theoretic vision. The condition that electric charge is conserved for preferred extremals provides an alternative and very promising approach.
The question of whether these various approaches are mutually consistent is discussed. It indeed turns out that the approach based on the conservation of electric charge leads, under rather general assumptions, to the proposal that solutions of the modified Dirac equation are localized on 2-dimensional string world sheets and/or partonic 2-surfaces. Einstein's equations are satisfied for the preferred extremals, and this implies that the earlier proposal for the realization of the Equivalence Principle is not needed. This leads to considerable progress in the understanding of super-Virasoro representations for the super-symplectic and super-Kac-Moody algebras. In particular, the proposal is that super-Kac-Moody currents assignable to string world sheets define duals of gauge potentials and their generalization for gravitons: in the approximation that the gauge group is Abelian (motivated by the notion of finite measurement resolution), the exponents of the sums of Kac-Moody charges would define non-integrable phase factors. One can also identify the Yangian as the algebra generated by these charges. The approach also makes it possible to understand the special role of the right-handed neutrino in SUSY according to TGD.
Category: Quantum Gravity and String Theory

[1] viXra:1208.0001 [pdf] submitted on 2012-08-01 12:14:18

Gluon Confinement in Yang-Mills Magnetic Monopoles

Authors: Jay R. Yablon
Comments: 2 Pages.

We point out a symmetry property of Yang-Mills magnetic monopoles which makes them plausible baryon candidates.
Category: High Energy Particle Physics