General Science and Philosophy

1412 Submissions

[51] viXra:1412.0281 [pdf] replaced on 2017-01-04 19:39:58

General Principle of Interaction, a Philosophical Concept for Extended Spacetime Geometry-based Complete Unification in Physics

Authors: Victor Paromov
Comments: 22 pages, 3 figures

It has long been expected that a quantum field theory (QFT) “beyond” the Standard Model (SM) will eventually unify gravitation with particle interactions. Unfortunately, this “Theory of Everything” remains elusive and problematic. An alternative approach is to explain all four types of physical interactions with the geometry of spacetime extended by unseen extra dimensions. The General Principle of Interaction (GPI) presents a philosophical concept that extends the Einsteinian understanding of geometrically deformed vacuum (spacetime) in order to explain particle interactions and establish a consistent basis for full unification. Surprisingly, this simple concept has remained undeveloped to this day. The GPI assumes that the extended spacetime includes one time dimension and three subspaces with seven spatial dimensions in total: ordinary spacetime (OST), an “electromagnetic space” (EMS) with one extra dimension, and a “nuclear space” (NS) with three extra dimensions; each type of interaction is governed by the geometry of one of these subspaces. The subspaces are closed and separated by their background curvatures: OST is flat (or almost flat), EMS is more curved, and NS is highly curved and compactified. Thus, gravitational, electromagnetic and strong interactions are governed by spacetime deformations originating separately in these three subspaces (OST, EMS and NS, respectively), and the weak interaction is a type of electromagnetic interaction. Additionally, it is hypothesized that all elementary particles can be described as quantized wave-like vacuum deformations originating in the extra dimensions (NS and/or EMS) with certain secondary effects measurable in the OST (mass, electromagnetic and strong fields). It is expected that the GPI-based unified theory will be able to describe all four types of interaction with the “pure geometry” of the 8D spacetime.
The theory should be developed as a minimal extension of the Einstein-Cartan (EC) theory, accessing both curvature (via the 8D metric) and torsion (via the 8D torsion tensor) in the three subspaces. Unlike gravitation, the electromagnetic and strong interactions (i.e. EMS and NS deformations) cannot be described by a classical field theory, as the extra coordinates (NS or EMS) are immeasurable within the OST subspace. Hence, the future theory should adopt quantum field methodology (at least to a certain extent) while rejecting the gauge transformation principle and relying solely on the spacetime geometry, which ensures background independence and avoids any virtual particles or gauge bosons. This concept promotes a vacuum (spacetime) geometry-based full unification and drastically simplifies the description of particle interactions by reducing the elementary particle set and the total number of interacting fields.
Category: General Science and Philosophy

[50] viXra:1412.0204 [pdf] submitted on 2014-12-19 07:21:37

Information from Theory Towards Science

Authors: Mirela Teodorescu, Adrian Nicolescu, Jozef Novak-Marcincin
Comments: 5 Pages.

Information from Theory towards Science, Professor Stefan Vlăduţescu's book from the University of Craiova, is a confirmation of the author's high level of intelligence and cognitive propensity. From various semiotic materials (words, images, gestures, drawings, etc.), following certain principles, under different procedures (operations, actions, movements, maneuvers, mechanisms, strategies), using means (languages, codes, subcodes) and specific tools (knowledge, concepts, categories) adapted to the aim, between earth (with autocorrection by feedback) and firmament (as anticipation by feed-forward), rises an imposing edifice, a cognitive construction: this is information. It has a systemic and procedural character and is organized along four coordinates: metric, semantic, structural and pragmatic.
Category: General Science and Philosophy

[49] viXra:1412.0154 [pdf] submitted on 2014-12-09 21:03:03

Review of Searle's Philosophy and Chinese Philosophy by Bo Mou 440p(2008)

Authors: Michael Starks
Comments: 17 Pages.

This book is invaluable as a synopsis of some of the work of one of the greatest philosophers of recent times. There is much value in analyzing his responses to the basic confusions of philosophy, and in the generally excellent attempts to connect classical Chinese thought to modern philosophy. I take a modern Wittgensteinian view to place it in perspective.
Category: General Science and Philosophy

[48] viXra:1412.0153 [pdf] submitted on 2014-12-09 21:06:10

Review of 'The Outer Limits of Reason' by Noson Yanofsky 403p(2013)

Authors: Michael Starks
Comments: 13 Pages.

I give a detailed review of 'The Outer Limits of Reason' by Noson Yanofsky, 403p (2013), from a unified perspective of Wittgenstein and evolutionary psychology. I indicate that the difficulties with such issues as paradox in language and math, incompleteness, undecidability, computability, the brain and the universe as computers, etc., all arise from the failure to look carefully at our use of language in the appropriate context, and hence the failure to separate issues of scientific fact from issues of how language works. I discuss Wittgenstein's views on incompleteness, paraconsistency and undecidability, and the work of Wolpert on the limits to computation.
Category: General Science and Philosophy

[47] viXra:1412.0139 [pdf] submitted on 2014-12-08 02:12:52

The Universal Consciousness on the Universal Code

Authors: Miloje M. Rakocevic
Comments: 28 Pages. An extended version of Addendum 4 in our book "Genetic Code as a Unique System", presented at our web site.

There are many approaches to investigating consciousness. In this paper we show that it makes sense to speak of consciousness as the comprehension of something, and, furthermore, of the universal consciousness as the universal comprehension of the universal code: comprehension by different investigators, in different creative works, through different epochs.
Category: General Science and Philosophy

[46] viXra:1412.0105 [pdf] submitted on 2014-12-03 23:55:25

An Application of DSmT in Ontology-Based Fusion Systems

Authors: Ksawery Krenc, Adam Kawalec
Comments: 8 Pages.

The aim of this paper is to propose an ontology framework for preselected sensors suited to the needs of sensor networks performing a specific task, such as target threat recognition. The problem is addressed methodically, taking into account in particular the non-deterministic nature of the functions assigning the concept and relation sets to the concept and relation lexicon sets, respectively, and vice versa. This may effectively enhance the efficiency of the information fusion performed in sensor networks.
Category: General Science and Philosophy

[45] viXra:1412.0104 [pdf] submitted on 2014-12-04 00:07:18

Dempster-Shafer Theory: Combination of Information Using Contextual Knowledge

Authors: Mihai Cristian Florea, Eloie Bosse
Comments: 7 Pages.

The aim of this paper is to investigate how to improve the process of information combination using the Dempster-Shafer Theory (DST). In the presence of an overload of information and an unknown environment, the reliability of the sources of information or the sensors is usually unknown and thus cannot be used to refine the fusion process. In a previous paper [1], the authors investigated different techniques to evaluate contextual knowledge from a set of mass functions (membership of a BPA to a set of BPAs, relative reliabilities of BPAs, credibility degrees, etc.). The purpose of this paper is to investigate how to use this contextual knowledge to improve the fusion process.
Category: General Science and Philosophy
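The two ingredients this abstract builds on, Dempster's rule of combination and classical (Shafer) reliability discounting, can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' code; the frame, masses and reliability value are made up:

```python
from itertools import product

def discount(bba, alpha, frame):
    """Shafer discounting: scale each mass by the reliability alpha
    and move the remaining 1-alpha onto total ignorance (the frame)."""
    out = {A: alpha * m for A, m in bba.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

def dempster(m1, m2):
    """Dempster's rule: conjunctive combination of two bba's, then
    normalization over the conflicting (empty-intersection) mass."""
    combined, conflict = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            combined[inter] = combined.get(inter, 0.0) + a * b
        else:
            conflict += a * b
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

# Illustrative frame and masses (Friend/Hostile), not from the paper.
frame = frozenset({"F", "H"})
m1 = {frozenset({"F"}): 0.8, frame: 0.2}
m2 = {frozenset({"H"}): 0.6, frame: 0.4}
fused = dempster(m1, discount(m2, 0.5, frame))  # source 2 judged 50% reliable
```

When the reliability of a source is unknown, the discounting factor `alpha` cannot be set in this way, which is exactly the gap the contextual-knowledge approach of the paper aims to fill.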

[44] viXra:1412.0103 [pdf] submitted on 2014-12-04 00:09:19

DSm Theory for Fusing Highly Conflicting Esm Reports

Authors: Pierre Valin, Pascal Djiknavorian, Dominic Grenier
Comments: 7 Pages.

Electronic Support Measures consist of passive receivers which can identify emitters coming from a small bearing angle, which, in turn, can be related to platforms that belong to 3 classes: Friend, Neutral, or Hostile. Decision makers prefer results presented in STANAG 1241 allegiance form, which adds 2 new classes: Assumed Friend and Suspect. Dezert-Smarandache (DSm) theory is particularly suited to this problem, since it allows for intersections between the original 3 classes. Results are presented showing that the theory can be successfully applied to the problem of associating ESM reports to established tracks, and its results identify when misassociations have occurred and to what extent. Results are also compared to Dempster-Shafer theory, which can only reason on the original 3 classes. Thus decision makers are offered STANAG 1241 allegiance results in a timely manner, with quick allegiance changes when appropriate and stability in allegiance declaration otherwise.
Category: General Science and Philosophy

[43] viXra:1412.0102 [pdf] submitted on 2014-12-04 00:12:12

Gmti and Imint Data Fusion for Multiple Target Tracking and Classification

Authors: Benjamin Pannetier, Jean Dezert
Comments: 8 Pages.

In this paper, we propose a new approach to track multiple ground targets with GMTI (Ground Moving Target Indicator) and IMINT (IMagery INTelligence) reports. This tracking algorithm takes into account road network information and is adapted to the out-of-sequence measurement problem. The scope of the paper is to fuse the attribute type information given by heterogeneous sensors with DSmT (Dezert-Smarandache Theory) and to introduce the type results into the tracking process. We show, on a realistic scenario, the ground target tracking improvement obtained thanks to better target discrimination and efficient management of conflicting information.
Category: General Science and Philosophy

[42] viXra:1412.0101 [pdf] submitted on 2014-12-04 00:14:01

Impact of HRR Radar Processing on Moving Target Identification Performance

Authors: Bart Kahler, Erik Blasch
Comments: 8 Pages.

Airborne radar tracking in moving ground vehicle scenarios is impacted by sensor, target, and environmental dynamics. Moving targets can be assessed with 1-D High Range Resolution (HRR) Radar profiles of sufficient signal-to-noise ratio (SNR), which contain enough feature information to discern one target from another, helping to maintain track or to identify the vehicle.
Category: General Science and Philosophy

[41] viXra:1412.0100 [pdf] submitted on 2014-12-04 00:16:56

Implication of Culture: User Roles in Information Fusion for Enhanced Situational Understanding

Authors: Erik Blasch, Pierre Valin, Eloi Bosse, Maria Nilsson, Joeri van Laere, Elisa Shahbazian
Comments: 8 Pages.

Information Fusion coordinates large-volume data processing machines to address user needs. Users expect a situational picture to extend their ability of sensing events, movements, and activities. Typically, data is collected and processed for object location (e.g. target identification) and movement (e.g. tracking); however, high-level reasoning or situational understanding depends on spatial, cultural, and political effects. In this paper, we explore opportunities where information fusion can aid in the selection and processing of the data for enhanced tacit knowledge understanding through (1) display fusion for data presentation (e.g. cultural segmentation), (2) interactive fusion to allow the user to inject a priori knowledge (e.g. cultural values), and (3) associated metrics of predictive capabilities (e.g. cultural networks). In a simple scenario for target identification with deception, the impact of cultural information on situational understanding is demonstrated using the Technology-Emotion-Culture-Knowledge (TECK) attributes of the Observe-Orient-Decide-Act (OODA) model.
Category: General Science and Philosophy

[40] viXra:1412.0099 [pdf] submitted on 2014-12-04 00:19:12

Modeling Evidence Fusion Rules by Means of Referee Functions

Authors: Frederic Dambreville
Comments: 8 Pages.

This paper defines a new concept and framework for constructing fusion rules for evidence. This framework is based on a referee function, which performs a decisional arbitrament conditionally on the basic decisions provided by the several sources of information. A simple sampling method is derived from this framework. The purpose of this sampling approach is to avoid the combinatorics inherent in the definition of fusion rules of evidence.
Category: General Science and Philosophy

[39] viXra:1412.0098 [pdf] submitted on 2014-12-04 00:21:30

Real World Implementation of Belief Function Theory to Detect Dislocation of Materials in Construction

Authors: Saiedeh N. Razavi, Carl T. Haas, Philippe Vanheeghe, Emmanuel Duflos
Comments: 8 Pages.

Dislocations of construction materials on large sites represent critical state changes. The ability to detect dislocations automatically for tens of thousands of items can ultimately improve project performance significantly. A belief-function-based data fusion algorithm was developed to estimate material locations and detect dislocations. A dislocation is defined as the change between discrete sequential locations of critical materials, such as special valves or fabricated items, on a large construction project.
Category: General Science and Philosophy

[38] viXra:1412.0097 [pdf] submitted on 2014-12-04 00:23:19

Reliability and Combination Rule in the Theory of Belief Functions

Authors: Arnaud Martin
Comments: 8 Pages.

This paper presents a point of view to address an application with the theory of belief functions from a global approach. Indeed, in a belief application, the definition of the basic belief assignments and the tasks of reducing the number of focal elements, discounting, combination and decision must be thought out at the same time. Moreover, these tasks can be seen as a general process of belief transfer.
Category: General Science and Philosophy

[37] viXra:1412.0096 [pdf] submitted on 2014-12-04 00:35:56

Association Performance Enhancement Through Classification

Authors: Quirin Hamp, Leonhard Reindl
Comments: 9 Pages.

Association of spatial information about targets is conventionally based on measures such as the Euclidean or the Mahalanobis distance. These approaches produce satisfactory results when targets are more distant than the resolution of the employed sensing principle, but are limited if they lie closer. This paper describes an association method combined with classification to enhance performance. The method not only considers spatial distance, but also information about class membership during a post-processing step. Association of measurements that cannot be uniquely associated to only one estimate, but to multiple estimates, is achieved under the constraint of minimizing the conflict of the combination of mutual class memberships.
Category: General Science and Philosophy

[36] viXra:1412.0095 [pdf] submitted on 2014-12-04 00:37:42

Decision-Level Fusion Performance Improvement From Enhanced HRR Radar Clutter Suppression

Authors: Bart Kahler, Erik Blasch
Comments: 16 Pages.

Airborne radar tracking in moving ground vehicle scenarios is impacted by sensor, target, and environmental dynamics. Moving targets can be characterized by 1-D High Range Resolution (HRR) Radar profiles with sufficient Signal-to-Noise Ratio (SNR). The amplitude feature information for each range bin of the HRR profile is used to discern one target from another to help maintain track or to identify a vehicle. Typical radar clutter suppression algorithms developed for processing moving ground target data remove not only the surrounding clutter, but also a portion of the target signature. Enhanced clutter suppression can be achieved using a Multi-channel Signal Subspace (MSS) algorithm, which preserves target features. In this paper, we (1) exploit extra feature information from enhanced clutter suppression for Automatic Target Recognition (ATR), (2) present a Decision-Level Fusion (DLF) gain comparison using Displaced Phase Center Antenna (DPCA) and MSS clutter-suppressed HRR data; and (3) develop a confusion-matrix identity fusion result for Simultaneous Tracking and Identification (STID). The results show that more channels for MSS increase identification over DPCA, result in a slightly noisier clutter-suppressed image, and preserve more target features after clutter cancellation. The paper's contributions include extending a two-channel MSS clutter cancellation technique to three channels, verifying that MSS is superior to the DPCA technique for target identification, and comparing these techniques in a novel multi-look confusion matrix decision-level fusion process.
Category: General Science and Philosophy

[35] viXra:1412.0093 [pdf] submitted on 2014-12-04 00:40:54

A Pragmatic Approach for the use of Dempster-Shafer Theory in Fusing Realistic Sensor Data

Authors: Pierre Valin, Pascal Djiknavorian, Eloi Bosse
Comments: 8 Pages.

This article addresses the performance of Dempster-Shafer (DS) theory when it is slightly modified to prevent it from becoming too certain of its decision upon accumulation of supporting evidence. Since this is done by requiring that the ignorance never becomes too small, one can refer to this variant of DS theory as Thresholded-DS. In doing so, one ensures that DS can respond quickly to a consistent change in the evidence that it fuses. Only realistic data is fused, where realism is discussed in terms of data certainty and data accuracy, thereby avoiding Zadeh's paradox. Performance measures of Thresholded-DS are provided for various thresholds in terms of sensor data certainty and fusion accuracy, to help designers assess beforehand, by varying the threshold appropriately, the achievable performance in terms of the estimated certainty and accuracy of the data that must be fused. The performance measures are twofold: first, stability when the fused data are consistent, and second, the latency in the response time when an abrupt change occurs in the data to be fused. These two performance measures must be traded off against each other, which is why the performance curves will be very helpful for designers of multi-source information fusion systems using Thresholded-DS.
Category: General Science and Philosophy
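The thresholding idea can be illustrated in Python. The exact scheme is defined in the paper, so this sketch is only one plausible reading, in which the ignorance mass is floored at a threshold t after each fusion step; all names and values below are illustrative:

```python
from itertools import product

def conjunctive(m1, m2):
    """Unnormalized conjunctive rule; conflict accumulates on the empty set."""
    out = {}
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        out[A & B] = out.get(A & B, 0.0) + a * b
    return out

def thresholded_ds(m1, m2, frame, t=0.1):
    """Dempster fusion, then floor the ignorance m(frame) at t and
    rescale the other focal masses so the bba still sums to 1."""
    raw = conjunctive(m1, m2)
    conflict = raw.pop(frozenset(), 0.0)
    fused = {A: v / (1.0 - conflict) for A, v in raw.items()}
    ign = fused.get(frame, 0.0)
    if ign < t:
        scale = (1.0 - t) / (1.0 - ign)
        fused = {A: v * scale for A, v in fused.items() if A != frame}
        fused[frame] = t
    return fused

# Illustrative: two strongly agreeing sources would normally drive
# ignorance toward zero; the threshold keeps it at t so the filter
# can still react to a later change in the evidence.
frame = frozenset({"F", "H"})
m = {frozenset({"F"}): 0.9, frame: 0.1}
fused = thresholded_ds(m, m, frame, t=0.1)
```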

[34] viXra:1412.0092 [pdf] submitted on 2014-12-04 00:42:38

Robust Bayesianism: Relation to Evidence Theory

Authors: Stefan Arnborg, Kungliga Tekniska Hogskolan
Comments: 15 Pages.

We are interested in understanding the relationship between Bayesian inference and evidence theory. The concept of a set of probability distributions is central both in robust Bayesian analysis and in some versions of Dempster-Shafer's evidence theory. We interpret imprecise probabilities as imprecise posteriors obtainable from imprecise likelihoods and priors, both of which are convex sets that can be considered as evidence and represented with, e.g., DS-structures. In Bayesian analysis, likelihoods and priors are combined with Laplace's parallel composition. The natural and simple robust combination operator makes all pairwise combinations of elements from the two sets representing prior and likelihood. Our proposed combination operator is unique, and it has interesting normative and factual properties. We compare its behavior with other proposed fusion rules, and with earlier efforts to reconcile Bayesian analysis and evidence theory. The behavior of the robust rule is consistent with the behavior of Fixsen/Mahler's modified Dempster's (MDS) rule, but not with Dempster's rule. The Bayesian framework is liberal in allowing all significant uncertainty concepts to be modeled and taken care of, and is therefore a viable, but probably not the only, unifying structure that can be economically taught and in which alternative solutions can be modeled, compared and explained.
Category: General Science and Philosophy

[33] viXra:1412.0089 [pdf] submitted on 2014-12-04 00:49:22

Implementation of Approximations of Belief Functions for Fusion of Esm Reports Within the DSm Framework

Authors: Pascal Djiknavorian, Pierre Valin, Dominic Grenier
Comments: 8 Pages.

Electronic Support Measures consist of passive receivers which can identify emitters which, in turn, can be related to platforms that belong to 3 classes: Friend, Neutral, or Hostile. Decision makers prefer results presented in STANAG 1241 allegiance form, which adds 2 new classes: Assumed Friend and Suspect. Dezert-Smarandache (DSm) theory is particularly suited to this problem, since it allows for intersections between the original 3 classes. However, the DSm hybrid combination rule is highly complex to execute and requires large amounts of resources. We have applied and studied a Matlab implementation of Tessem's k-l-x, Lowrance's summarization and Simard's approximation techniques in the DSm theory for the fusion of ESM reports. Results are presented showing that we can improve on the execution time while maintaining, or in some cases improving, the rate of good decisions.
Category: General Science and Philosophy
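Tessem's k-l-x approximation mentioned here keeps the k highest-mass focal elements, adds more (up to l in total) until the discarded mass is at most x, and renormalizes. A minimal Python sketch with an illustrative bba, not taken from the paper:

```python
def klx(bba, k, l, x):
    """Tessem's k-l-x approximation: keep the k highest-mass focal
    elements, keep adding (up to l in total) while the kept mass is
    below 1 - x, then renormalize the kept masses."""
    ranked = sorted(bba.items(), key=lambda kv: kv[1], reverse=True)
    kept, mass = [], 0.0
    for i, (A, m) in enumerate(ranked):
        if i < k or (i < l and mass < 1.0 - x):
            kept.append((A, m))
            mass += m
        else:
            break
    return {A: m / mass for A, m in kept}

# Illustrative bba over Friend/Hostile/Neutral, not from the paper.
bba = {frozenset({"F"}): 0.5, frozenset({"H"}): 0.3,
       frozenset({"N"}): 0.15, frozenset({"F", "H"}): 0.05}
approx = klx(bba, k=1, l=3, x=0.1)  # drops the smallest focal element
```

Fewer focal elements means fewer pairwise intersections in the subsequent combination step, which is the source of the execution-time gains the abstract reports.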

[32] viXra:1412.0088 [pdf] submitted on 2014-12-04 00:51:14

Is Entropy Enough to Evaluate the Probability Transformation Approach of Belief Function?

Authors: Deqiang Han, Jean Dezert, Chongzhao Han, Yi Yang
Comments: 7 Pages.

In the Dempster-Shafer Theory (DST) of evidence and the transferable belief model (TBM), the probability transformation is necessary and crucial for decision-making. The evaluation of the quality of a probability transformation is usually based on entropy or probabilistic information content (PIC) measures, which are questioned in this paper. An alternative probability transformation approach based on uncertainty minimization is proposed to test whether entropy or PIC are rational evaluation criteria for probability transformations. Based on experimental comparisons among different probability transformation approaches, the rationality of using entropy or PIC measures to evaluate probability transformations is analyzed and discussed.
Category: General Science and Philosophy
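For reference, the best-known probability transformation (the pignistic transformation) and the PIC measure that the paper questions can be sketched in Python; these are the standard textbook forms, and the bba below is illustrative:

```python
from math import log2

def pignistic(bba):
    """Pignistic transformation BetP: each focal mass is split equally
    among the singletons it contains (closed world, m(empty) = 0)."""
    betp = {}
    for A, m in bba.items():
        for x in A:
            betp[x] = betp.get(x, 0.0) + m / len(A)
    return betp

def pic(p):
    """Probabilistic Information Content: 1 minus normalized Shannon
    entropy; 1 for a certain outcome, 0 for the uniform distribution."""
    h = -sum(v * log2(v) for v in p.values() if v > 0)
    return 1.0 - h / log2(len(p))

# Illustrative bba, not from the paper.
bba = {frozenset({"a"}): 0.4, frozenset({"a", "b"}): 0.6}
p = pignistic(bba)  # {"a": 0.7, "b": 0.3}
```

The paper's point is that ranking transformations purely by the entropy (or PIC) of the resulting `p` rewards overconfident outputs, which is why a PIC-maximizing transformation is constructed as a counterexample.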

[31] viXra:1412.0087 [pdf] submitted on 2014-12-04 00:53:00

A PCR-Bimm Filter for Maneuvering Target Tracking

Authors: Jean Dezert, Benjamin Pannetier
Comments: 8 Pages.

In this paper we show how to correct and improve the Belief Interacting Multiple Model filter (BIMM) proposed in 2009 by Nassreddine et al. for tracking maneuvering targets. Our improved algorithm, called PCR-BIMM, is based on results developed in the DSmT (Dezert-Smarandache Theory) framework and concerns two main steps of BIMM: 1) the update of the basic belief assignment of modes, which is done by the Proportional Conflict Redistribution rule no. 5 rather than Smets' rule (the conjunctive rule); 2) the global target state estimation, which is obtained from the DSmP probabilistic transformation rather than the commonly used pignistic transformation. Monte Carlo simulation results are presented to show the performance of this PCR-BIMM filter with respect to classical IMM and BIMM filters on a very simple maneuvering target tracking scenario.
Category: General Science and Philosophy
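The Proportional Conflict Redistribution rule no. 5 (PCR5) used in step 1 can be sketched for two sources in Python. This is an illustrative sketch of the standard two-source rule, not the PCR-BIMM implementation, and the masses below are made up:

```python
from itertools import product

def pcr5(m1, m2):
    """PCR5 for two sources: conjunctive combination, with each piece
    of conflicting mass m1(A)*m2(B) (A and B disjoint) redistributed
    back to A and B in proportion to m1(A) and m2(B)."""
    out = {}
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        inter = A & B
        if inter:
            out[inter] = out.get(inter, 0.0) + a * b
        else:
            out[A] = out.get(A, 0.0) + a * a * b / (a + b)
            out[B] = out.get(B, 0.0) + b * b * a / (a + b)
    return out

# Illustrative conflicting sources (Friend vs Hostile), not from the paper.
m1 = {frozenset({"F"}): 0.6, frozenset({"F", "H"}): 0.4}
m2 = {frozenset({"H"}): 0.7, frozenset({"F", "H"}): 0.3}
fused = pcr5(m1, m2)  # no mass is lost to normalization
```

Unlike Dempster's rule, PCR5 keeps the conflicting mass in play instead of normalizing it away, which is what makes the mode bba update more robust under the strong conflicts that occur during target maneuvers.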

[30] viXra:1412.0086 [pdf] submitted on 2014-12-04 00:54:28

Union/intersection Vs. Alternative/conjunction Defining Posterior Hypotheses in C2 Systems

Authors: Ksawery Krenc, Adam Kawalec
Comments: 6 Pages.

This paper discusses the problem of applying logical operators when defining posterior hypotheses, that is, hypotheses which are not created directly from the sensor data. In the authors' opinion, the application of logical sentence operators is constrained to some specific cases, where set operators may be applied as well. On the other hand, set operators make it possible to provide much more adequate posterior hypotheses, which results in higher precision of the final fusion decision. To demonstrate this, an analysis has been made and some examples related to attribute information fusion in C2 systems are provided.
Category: General Science and Philosophy

[29] viXra:1412.0084 [pdf] submitted on 2014-12-04 01:39:21

Application of Referee Functions to the Vehicle-Born Improvised Explosive Device Problem

Authors: Frederic Dambreville
Comments: 8 Pages.

We propose a solution to the Vehicle-Born Improvised Explosive Device problem. This solution is based on a modelling by belief functions, and involves the construction of a combination rule dedicated to this problem. The construction of the combination rule is made possible by a tool developed in previous works: a generic framework dedicated to the construction of combination rules. This tool implies a tripartite architecture, with respective parts implementing the logical framework, the combination definition (referee function) and the computation processes. Referee functions perform a decisional arbitrament conditionally on the basic decisions provided by the sources of information, and allow rule definitions at the logical level adapted to the application. We construct a referee function for the Vehicle-Born Improvised Explosive Device problem, and compare it to reference combination rules.
Category: General Science and Philosophy

[28] viXra:1412.0083 [pdf] submitted on 2014-12-04 01:46:56

Change Detection from Remote Sensing Images Based on Evidential Reasoning

Authors: Zhun-ga Liu, Jean Dezert, Gregoire Mercier, Quan Pan, Yong-mei Cheng
Comments: 8 Pages.

Theories of evidence have already been applied more or less successfully to the fusion of remote sensing images. In classical evidential reasoning, all the sources of evidence and their fusion results are related to the same invariable (static) frame of discernment. Nevertheless, changes may occur across multi-temporal remote sensing images, and in some applications these changes need to be detected efficiently. The invariable frame of classical evidential reasoning cannot efficiently represent or detect change occurrences across heterogeneous remote sensing images. To overcome this limitation, Dynamical Evidential Reasoning (DER) is proposed for the sequential fusion of multi-temporal images. A new state transition frame is defined in DER, and change occurrences can be precisely represented by introducing a state transition operator. The belief functions used in DER are defined similarly to those of the Dempster-Shafer Theory (DST). Two kinds of dynamical combination rules, working in the free model and the constrained model, are proposed in this new framework for dealing with the different cases. Finally, an experiment using three real satellite images acquired before and after an earthquake is provided to show the interest of the new approach.
Category: General Science and Philosophy

[27] viXra:1412.0081 [pdf] submitted on 2014-12-04 01:50:52

Edge Detection in Color Images Based on DSmT

Authors: Jean Dezert, Zhun-ga Liu, Gregoire Mercier
Comments: 8 Pages.

In this paper, we present a non-supervised methodology for edge detection in color images based on belief functions and their combination. Our algorithm is based on the fusion of local edge detector results expressed as basic belief assignments thanks to a flexible modeling, and on the proportional conflict redistribution rule developed in the DSmT framework. The application of this new belief-based edge detector is tested both on the original (noise-free) Lena picture and on a modified image including artificial pixel noise, to show the ability of our algorithm to work on noisy images too.
Category: General Science and Philosophy

[26] viXra:1412.0080 [pdf] submitted on 2014-12-04 01:52:22

GRP1. a Recursive Fusion Operator for the Transferable Belief Model

Authors: Gavin Powell, Matthew Roberts
Comments: 8 Pages.

Generally, there are problems with any form of recursive fusion based on belief functions. An open world is often required, but the empty set can become greedy: where conjunctive combination is used, all of the mass will, over time, become assigned to it. With disjunctive combination, all of the mass will move toward the ignorant set over time. Real-world problems often require an open world, but due to these limitations they are forced into a closed-world solution. GRP1 works iteratively in an open world in a temporally conscious fashion, allowing more recent measurements to have more impact. This approach makes it ideal for fusing and classifying streaming data.
Category: General Science and Philosophy
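The "greedy empty set" behavior described here is easy to reproduce: repeatedly fusing the same conflicting reading with the unnormalized conjunctive rule drives almost all mass onto the empty set. A small Python demonstration with illustrative masses (not from the paper):

```python
from itertools import product

def conjunctive(m1, m2):
    """Unnormalized conjunctive rule; conflicting mass stays on the
    empty set instead of being renormalized away (open world)."""
    out = {}
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        out[A & B] = out.get(A & B, 0.0) + a * b
    return out

# Two equally likely, mutually exclusive readings; fusing the same
# evidence recursively starves every non-empty set of mass.
obs = {frozenset({"F"}): 0.5, frozenset({"H"}): 0.5}
state = dict(obs)
for _ in range(10):
    state = conjunctive(state, obs)
empty_mass = state.get(frozenset(), 0.0)  # close to 1 after 10 steps
```

This is the pathology that forces recursive open-world fusion into closed-world workarounds, and that GRP1's temporally weighted combination is designed to avoid.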

[25] viXra:1412.0079 [pdf] submitted on 2014-12-04 01:54:17

Measurement-to-Track Association for Nontraditional Measurements

Authors: Ronald Mahler
Comments: 8 Pages.

Data fusion algorithms must typically address not only kinematic issues—that is, target tracking—but also nonkinematics—for example, target identification, threat estimation, intent assessment, etc. Whereas kinematics involves traditional measurements such as radar detections, nonkinematics typically involves nontraditional measurements such as quantized data, attributes, features, natural-language statements, and inference rules. The kinematic vs. nonkinematic chasm is often bridged by grafting some expert-system approach (fuzzy logic, Dempster-Shafer, rule-based inference) into a single- or multi-hypothesis multitarget tracking algorithm, using ad hoc methods. The purpose of this paper is to show that conventional measurement-to-track association theory can be directly extended to nontraditional measurements in a Bayesian manner. Concepts such as association likelihood, association distance, hypothesis probability, and global nearest-neighbor distance are defined, and explicit formulas are derived for specific kinds of nontraditional evidence.
Category: General Science and Philosophy

[24] viXra:1412.0078 [pdf] submitted on 2014-12-04 01:56:02

New Dissimilarity Measures in Evidence Theory

Authors: Deqiang Han, Jean Dezert, Chongzhao Han, Yi Yang
Comments: 7 Pages.

The dissimilarity of evidence, which represents the degree of dissimilarity between bodies of evidence (BOEs), has attracted more and more research interest and has been used in many applications based on evidence theory. In this paper, some novel dissimilarities of evidence are proposed using fuzzy set theory (FST). The basic belief assignments (bba's) are first transformed to measures in FST, and then, using the dissimilarity (or similarity) measures of FST, the dissimilarities between bba's are defined. Some numerical examples are provided to verify the rationality of the proposed dissimilarities of evidence.
Category: General Science and Philosophy
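A common baseline against which such dissimilarities of evidence are compared is the Jousselme distance, which weights mass differences by the Jaccard similarity of focal elements. A Python sketch with illustrative bba's (the paper's fuzzy-set-based measures are different constructions):

```python
from math import sqrt

def jousselme(m1, m2):
    """Jousselme distance between two bba's: a quadratic form on the
    mass difference vector, weighted by the Jaccard similarity
    |A∩B| / |A∪B| of each pair of focal elements."""
    focals = sorted(set(m1) | set(m2), key=sorted)
    diff = [m1.get(A, 0.0) - m2.get(A, 0.0) for A in focals]
    d2 = 0.0
    for i, A in enumerate(focals):
        for j, B in enumerate(focals):
            d2 += diff[i] * diff[j] * len(A & B) / len(A | B)
    return sqrt(0.5 * d2)

# Illustrative bba's, not from the paper.
m1 = {frozenset({"F"}): 0.6, frozenset({"F", "H"}): 0.4}
m2 = {frozenset({"H"}): 1.0}
d = jousselme(m1, m2)
```

The Jaccard weighting is what lets the distance account for partial overlap between focal elements, rather than treating {F} and {F, H} as unrelated coordinates.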

[23] viXra:1412.0075 [pdf] submitted on 2014-12-04 02:03:27

A Fuzzy-Cautious OWA Approach with Evidential Reasoning

Authors: Deqiang Han, Jean Dezert, Jean-Marc Tacnet, Chongzhao Han
Comments: 8 Pages.

Multi-criteria decision making (MCDM) is making decisions in the presence of multiple criteria. To make decisions in the framework of MCDM under uncertainty, a novel Fuzzy-Cautious OWA with evidential reasoning (FCOWA-ER) approach is proposed in this paper. The payoff matrix and the belief functions of the states of nature are used to generate the expected payoffs, from which two Fuzzy Membership Functions (FMFs), representing optimistic and pessimistic attitudes respectively, can be obtained. Two basic belief assignments (bba's) are then generated from the two FMFs. By evidence combination, a combined bba is obtained, which can be used to make the decision. FCOWA-ER avoids the weight-selection problem of traditional OWA. When compared with other evidential reasoning-based OWA approaches such as COWA-ER, FCOWA-ER has lower computational cost and clearer physical meaning. Some experiments and related analyses are provided to justify the proposed FCOWA-ER.
Category: General Science and Philosophy
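The ordered weighted averaging (OWA) step underlying the approach in the entry above can be sketched as follows; the weight vectors shown are the standard textbook examples, not taken from the paper.

```python
# OWA aggregation: values are sorted in decreasing order before being
# weighted, so the weights encode an optimistic-to-pessimistic attitude
# rather than being tied to particular criteria.

def owa(values, weights):
    """Dot product of the decreasingly sorted values with the weight vector."""
    assert abs(sum(weights) - 1.0) < 1e-9   # weights must sum to 1
    return sum(w * v for w, v in zip(weights, sorted(values, reverse=True)))

payoffs = [0.2, 0.9, 0.5]
optimistic  = owa(payoffs, [1.0, 0.0, 0.0])        # picks the max -> 0.9
pessimistic = owa(payoffs, [0.0, 0.0, 1.0])        # picks the min -> 0.2
neutral     = owa(payoffs, [1/3, 1/3, 1/3])        # plain average
```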

[22] viXra:1412.0074 [pdf] submitted on 2014-12-04 02:05:50

Hierarchical DSmP Transformation for Decision-Making Under Uncertainty

Authors: Deqiang Han, Jean Dezert, Zhun-ga Liu, Jean-Marc Tacnet
Comments: 8 Pages.

Dempster-Shafer evidence theory is widely used for approximate reasoning under uncertainty; however, decision-making is more intuitive and easier to justify when made in a probabilistic context. Thus, the transformation approximating a belief function by a probability measure is crucial for decision-making in the evidence theory framework. In this paper we present a new transformation of any general basic belief assignment (bba) into a Bayesian belief assignment (or subjective probability measure), based on a new proportional and hierarchical principle of uncertainty reduction. Some examples are provided to show the rationality and efficiency of the proposed probability transformation approach.
Category: General Science and Philosophy
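For context, the classical pignistic transformation BetP is the baseline that DSmP-style transformations refine; a minimal sketch follows (the hierarchical DSmP of the paper itself is not reproduced here).

```python
# BetP: each focal set shares its mass equally among its elements.

def betp(bba):
    """BetP(x) = sum over focal sets A containing x of m(A)/|A| (m(empty)=0)."""
    out = {}
    for focal, mass in bba.items():
        share = mass / len(focal)
        for x in focal:
            out[x] = out.get(x, 0.0) + share
    return out

m = {frozenset("a"): 0.4, frozenset("ab"): 0.6}
p = betp(m)   # the mass on {a,b} is split evenly between a and b
```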

[21] viXra:1412.0072 [pdf] submitted on 2014-12-04 02:13:15

New Basic Belief Assignment Approximations Based on Optimization

Authors: Deqiang Han, Jean Dezert, Chongzhao Han
Comments: 8 Pages.

The theory of belief functions, also called Dempster-Shafer evidence theory, has proved to be a very useful representation scheme for expert and other knowledge-based systems. However, the computational complexity of evidence combination grows rapidly with the cardinality of the frame of discernment. To reduce the computational cost of evidence combination, the idea of basic belief assignment (bba) approximation was proposed, which reduces the complexity of the given bba's. For a good bba approximation, the approximated bba should be similar (in some sense) to the original bba. In this paper, we use the distance of evidence together with the difference between the uncertainty degrees of the approximated and original bba's to construct a comprehensive measure, which represents the similarity between the approximated bba and the original one. By using this comprehensive measure as the objective function and designing appropriate constraints, bba approximation is converted into an optimization problem. Comparative experiments are provided to show the rationality of the comprehensive similarity measure and of the designed constraints.
Category: General Science and Philosophy
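As an illustrative sketch only (not the authors' optimization-based method), the classical "keep the k largest focal elements and renormalize" approximation shows what bba approximation means, with a simple L1 distance used to check how far the approximation drifts from the original.

```python
# Keep-k bba approximation plus an L1 drift check.

def approximate(bba, k):
    """Keep the k focal elements with the largest mass, renormalize to 1."""
    kept = dict(sorted(bba.items(), key=lambda kv: kv[1], reverse=True)[:k])
    total = sum(kept.values())
    return {focal: mass / total for focal, mass in kept.items()}

def l1_distance(m1, m2):
    """Sum of absolute mass differences over all focal elements."""
    keys = set(m1) | set(m2)
    return sum(abs(m1.get(f, 0.0) - m2.get(f, 0.0)) for f in keys)

m = {frozenset("a"): 0.5, frozenset("b"): 0.3,
     frozenset("ab"): 0.15, frozenset("abc"): 0.05}
m_approx = approximate(m, 2)          # only {a} and {b} survive
d_approx = l1_distance(m, m_approx)   # how much the approximation changed m
```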

[20] viXra:1412.0071 [pdf] submitted on 2014-12-04 02:15:34

A New Evidential C-Means Clustering Method

Authors: Zhun-ga Liu, Jean Dezert, Quan Pan, Yong-mei Cheng
Comments: 8 Pages.

Data clustering methods integrating information fusion techniques have recently been developed in the framework of belief functions. More precisely, the evidential c-means (ECM) method has been proposed to deal with the clustering of proximity data, as an extension of the popular fuzzy c-means (FCM) clustering method. In fact, ECM does not perform very well for proximity data because it determines the mass of belief of an object's commitment based only on the distance between the object and the clusters' centers. As a result, different clusters can overlap when their centers are close, which is not very efficient for data clustering. To overcome this problem, we propose a new clustering method called belief functions c-means (BFCM). In BFCM, the mass determination takes into account both the distance between the object and the imprecise cluster's center, and the distances between the object and the centers of the involved specific clusters. An object is considered to belong to a specific cluster if it is very close to this cluster's center, to an imprecise cluster if it lies in the middle (overlapping zone) of several specific clusters, or to the outlier cluster if it is too far from the data set. Pignistic probability can be applied to support hard decision making in BFCM. Several examples are given to illustrate how BFCM works, and to show how it outperforms ECM and FCM for proximity data.
Category: General Science and Philosophy
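The FCM membership update that ECM and BFCM build on can be sketched for a single object; the fuzzifier value (beta = 2) is the usual default, and the credal extension to meta-clusters and the outlier cluster is deliberately not reproduced here.

```python
# FCM memberships of one object from its distances to the cluster centers:
# u_k = 1 / sum_j (d_k / d_j)^(2/(beta-1)).

def fcm_memberships(dists, beta=2.0):
    """Memberships of one object given its distances to each cluster center."""
    exponent = 2.0 / (beta - 1.0)
    return [1.0 / sum((dk / dj) ** exponent for dj in dists) for dk in dists]

u = fcm_memberships([1.0, 2.0])   # the closer center gets the larger membership
```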

[19] viXra:1412.0070 [pdf] submitted on 2014-12-04 02:18:33

Soft Electre Tri Outranking Method Based on Belief Functions

Authors: Jean Dezert, Jean-Marc Tacnet
Comments: 8 Pages.

Most decision problems can be described as the choice, ranking or sorting of a set of alternatives. The classical ELECTRE TRI (ET) method is a multicriteria-based outranking sorting method that assigns alternatives to a set of predetermined categories. The ET method deals with situations where indifference is not transitive and solutions can sometimes appear incomparable. ET suffers from two main drawbacks: 1) it requires an arbitrary choice of cut step to perform the outranking of alternatives versus profiles of categories, and 2) it requires an arbitrary choice of attitude for the final assignment of alternatives to the categories. Moreover, ET gives only a binary (hard) assignment of alternatives to categories. In this paper we develop a soft version of the ET method based on belief functions, which circumvents the aforementioned drawbacks and yields both a soft (probabilistic) assignment of alternatives to categories and an indicator of the consistency of the soft solution. This Soft-ET approach is applied to a concrete example to show how it works and to compare it with the classical ET method.
Category: General Science and Philosophy

[18] viXra:1412.0069 [pdf] submitted on 2014-12-04 02:20:01

On The Validity of Dempster-Shafer Theory

Authors: Jean Dezert, Pei Wang, Albena Tchamova
Comments: 6 Pages.

We challenge the validity of Dempster-Shafer Theory by using an emblematic example to show that the DS rule produces counter-intuitive results. Further analysis reveals that the result comes from an understanding of evidence pooling which goes against the common expectation of this process. Although DS theory has attracted some interest from the scientific community working in information fusion and artificial intelligence, its validity for solving practical problems is problematic, because it is not applicable to evidence combination in general, but only to certain types of situations which still need to be clearly identified.
Category: General Science and Philosophy
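The abstract does not reproduce its emblematic example, but Zadeh's classic two-expert case (an assumption here, chosen as the best-known illustration) shows the kind of counter-intuitive outcome at issue: both sources nearly rule out hypothesis "c", yet Dempster's rule assigns it full belief.

```python
# Dempster's rule with conflict renormalization, on Zadeh's example.

def dempster(m1, m2):
    """Conjunctive combination of two bba's, renormalized by the conflict."""
    combined, conflict = {}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

m1 = {frozenset("a"): 0.99, frozenset("c"): 0.01}
m2 = {frozenset("b"): 0.99, frozenset("c"): 0.01}
m12 = dempster(m1, m2)   # all remaining mass lands on {c}
```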

[17] viXra:1412.0067 [pdf] submitted on 2014-12-04 02:24:19

On the Consistency of PCR6 with the Averaging Rule and Its Application to Probability Estimation

Authors: Florentin Smarandache, Jean Dezert
Comments: 8 Pages.

Since the development of belief function theory introduced by Shafer in the seventies, many combination rules have been proposed in the literature to combine belief functions, especially (but not only) in highly conflicting situations, because the emblematic Dempster's rule generates counter-intuitive and unacceptable results in practical applications. Many attempts have been made during the last thirty years to propose better rules of combination based on different frameworks and justifications. Recently, in the DSmT (Dezert-Smarandache Theory) framework, two interesting and sophisticated rules (PCR5 and PCR6) have been proposed based on the Proportional Conflict Redistribution (PCR) principle. These two rules coincide for the combination of two basic belief assignments, but they differ in general as soon as three or more sources have to be combined, because the redistributions used in PCR5 and in PCR6 are different. In this paper we show why PCR6 is better than PCR5 for combining three or more sources of evidence, and we prove the coherence of PCR6 with the simple averaging rule used classically to estimate a probability under the frequentist interpretation of the probability measure. We show that such a probability estimate cannot be obtained using the Dempster-Shafer (DS) rule, nor the PCR5 rule.
Category: General Science and Philosophy
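A sketch of the two-source case, where PCR5 and PCR6 coincide (as the abstract notes): each partial conflict m1(X)m2(Y) with X∩Y = ∅ is split back onto X and Y proportionally to the masses that created it, so no renormalization is needed.

```python
# Two-source PCR5/PCR6 combination on bba's keyed by frozensets.

def pcr5(m1, m2):
    out = {}
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:                 # conjunctive (non-conflicting) part
                out[inter] = out.get(inter, 0.0) + v1 * v2
            elif v1 + v2 > 0:         # proportional conflict redistribution
                out[f1] = out.get(f1, 0.0) + v1 * v1 * v2 / (v1 + v2)
                out[f2] = out.get(f2, 0.0) + v2 * v2 * v1 / (v1 + v2)
    return out

m1 = {frozenset("a"): 0.6, frozenset("b"): 0.4}
m2 = {frozenset("a"): 0.7, frozenset("b"): 0.3}
m12 = pcr5(m1, m2)   # masses still sum to 1 without renormalization
```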

[16] viXra:1412.0066 [pdf] submitted on 2014-12-04 02:26:49

Design of Dynamic Multiple Classifier Systems Based on Belief Functions

Authors: Deqiang Han, X. Rong Li, Shaoyi Liang
Comments: 8 Pages.

The technique of Multiple Classifier Systems (MCSs), a kind of decision-level information fusion, has quickly become popular among researchers for fusing multiple classification outputs to obtain better classification accuracy. In MCSs, various kinds of uncertainty exist, such as the ambiguity of the output of an individual member classifier and the inconsistency among the outputs of member classifiers. In this paper, we model the uncertainties in MCSs based on the theory of belief functions. The outputs of member classifiers are modeled using belief functions. A new measure of diversity among member classifiers is established using the distance of evidence, and the fusion rule adopted for MCSs is Dempster's rule of combination. The construction of MCSs based on the proposed diversity measure is a dynamic procedure and can achieve better performance than existing diversity measures. Experimental results and related analyses show that the proposed measure and approach are rational and effective.
Category: General Science and Philosophy

[15] viXra:1412.0065 [pdf] submitted on 2014-12-04 02:32:20

A DSmT Based Combination Scheme for Multi-Class Classification

Authors: Nassim Abbas, Youcef Chibani, Zineb Belhadi, Mehdia Hedir
Comments: 8 Pages.

This paper presents a new combination scheme that reduces the number of focal elements to manipulate, in order to reduce the complexity of the combination process in the multi-class framework. The basic idea is to use p sources of information, each providing complementary information, to feed p one-class support vector machine classifiers independently of each other; these classifiers are designed to detect the outliers of the same target class, and their outputs are then combined through the plausible and paradoxical reasoning theory for each target class. The main objective of this approach is to produce calibrated outputs even when less complementary responses are encountered. A version inspired by Appriou's model for estimating the generalized basic belief assignments is also presented. The proposed methodology decomposes an n-class problem into a series of n combinations, while providing n calibrated outputs in the multi-class framework. The effectiveness of the proposed combination scheme with the proportional conflict redistribution algorithm is validated on a digit recognition application and compared with existing statistical, learning, and evidence theory based combination algorithms.
Category: General Science and Philosophy

[14] viXra:1412.0064 [pdf] submitted on 2014-12-04 02:53:47

Evidence Combination based on CSP Modeling

Authors: Faouzi Sebbak, Farid Benhammadi, Aicha Mokhtari, Abdelghani Chibani, Yacine Amirat
Comments: 8 Pages.

Evidence theory and its variants are mathematical formalisms used to represent uncertain as well as ambiguous data. The evidence combination rules proposed in these formalisms agree with Bayesian probability calculus in special cases, but not in general. To better reconcile belief function theory with Bayesian probability calculus, this work proposes a new way of combining beliefs to estimate combined evidence, based on Constraint Satisfaction Problem (CSP) modeling. The solutions of these constraint problems are then combined using Dempster's rule. The formalism is tested using information system security risk simulations. The results show that our model produces intuitive results and agrees with Bayesian probability calculus.
Category: General Science and Philosophy

[13] viXra:1412.0063 [pdf] submitted on 2014-12-04 03:18:45

Image Registration Based on Evidential Reasoning

Authors: Deqiang Han, Jean Dezert, Shicheng Li, Chongzhao Han, Yi Yang
Comments: 8 Pages.

Image registration is a crucial and necessary step before image fusion. It aims to achieve the optimal match between two or more images of the same scene taken at different times, from different viewpoints, and/or by different sensors. In the procedure of image registration, several types of uncertainty are encountered, e.g., in the selection of control points and in the distance or dissimilarity measures used for image matching. In this paper, we model these uncertainties in image registration using the theory of belief functions. By jointly using pixel-level and feature-level information, more effective image registrations are accomplished. Experimental results, comparisons and related analyses illustrate the effectiveness of our evidential reasoning based image registration approach.
Category: General Science and Philosophy

[12] viXra:1412.0062 [pdf] submitted on 2014-12-04 03:20:48

A Markov Multi-Phase Transferable Belief Model: an Application for Predicting Data Exfiltration APTs

Authors: Georgios Ioannou, Panos Louvieris, Natalie Clewley, Gavin Powell
Comments: 8 Pages.

eXfiltration Advanced Persistent Threats (XAPTs) increasingly account for incidents involving intelligence information gathering by malicious adversaries. This research exploits the multi-phase nature of an XAPT, mapping its phases onto a cyber attack kill chain. A novel Markov Multi-Phase Transferable Belief Model (MM-TBM) is proposed and demonstrated for fusing incoming evidence from a variety of sources while taking conflicting information into account. The MM-TBM algorithm predicts a cyber attacker's actions against a computer network and provides a visual representation of their footsteps.
Category: General Science and Philosophy

[11] viXra:1412.0061 [pdf] submitted on 2014-12-04 03:22:40

New Evidence Combination Rules for Activity Recognition in Smart Home

Authors: Faouzi Sebbak, Farid Benhammadi, Abdelghani Chibani, Yacine Amirat, Aicha Mokhtari
Comments: 7 Pages.

Evidence theory and its proportional conflict redistribution variant rules are mathematical formalisms used to represent uncertain as well as ambiguous data. The evidence combination rules proposed in these formalisms do not satisfy the idempotence property. However, in a variety of applications it is desirable that evidence combination rules satisfy this property. In response to this challenge, the present work proposes a new formalism for reasoning under uncertainty based on new concepts of consensus and conflict of evidence. This formalism is evaluated on a real-world activity recognition problem in a smart home environment. The results show that one rule of our formalism respects the idempotence property and improves the accuracy of activity recognition.
Category: General Science and Philosophy
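A quick illustration of the idempotence issue raised in the abstract above: combining a source with itself should ideally return it unchanged. The averaging rule is idempotent; Dempster's rule is not.

```python
# Idempotence check: averaging vs. Dempster's rule on the same bba.

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

def average(m1, m2):
    keys = set(m1) | set(m2)
    return {f: (m1.get(f, 0.0) + m2.get(f, 0.0)) / 2.0 for f in keys}

m = {frozenset("a"): 0.7, frozenset("b"): 0.3}
avg = average(m, m)    # identical to m: the averaging rule is idempotent
ds = dempster(m, m)    # m({a}) inflated to 0.49/0.58, roughly 0.845
```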

[10] viXra:1412.0060 [pdf] submitted on 2014-12-04 03:24:20

New Neighborhood Classifiers Based on Evidential Reasoning

Authors: Deqiang Han, Jean Dezert, Chongzhao Han, Yi Yang
Comments: 8 Pages.

Neighborhood-based classifiers are commonly used in pattern classification applications. However, their implementation always involves problems of uncertainty. For example, when one uses a k-NN classifier, the parameter k must be determined, and it can be large or small; thus, uncertainty in the classification is caused by the choice of k. Furthermore, for the nearest neighbor (NN) classifier, one can use either the nearest neighbor or the nearest class centroid, so different classification results can be obtained; this is a type of uncertainty caused by using local or global information, respectively. In this paper, we use the theory of belief functions to model and manage these two types of uncertainty, and we propose evidential reasoning based neighborhood classifiers. It is experimentally verified that the proposed approach deals efficiently with the uncertainty in neighborhood classifiers.
Category: General Science and Philosophy
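A minimal sketch in the spirit of Denoeux's evidential k-NN (the paper's own classifiers are not reproduced here): each neighbor yields a simple bba supporting its class, discounted with distance, and the k bba's are fused with Dempster's rule; the residual mass stays on the whole frame. The parameters `alpha` and `gamma` are typical illustrative values, not taken from the paper.

```python
# Evidential k-NN sketch: distance-discounted neighbor bba's fused by
# Dempster's rule.

import math
from functools import reduce

def neighbor_bba(label, dist, classes, alpha=0.95, gamma=1.0):
    """bba from one neighbor: some mass on its class, the rest on the frame."""
    support = alpha * math.exp(-gamma * dist ** 2)
    return {frozenset([label]): support, frozenset(classes): 1.0 - support}

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

classes = ("a", "b")
neighbors = [("a", 0.2), ("a", 0.5), ("b", 1.5)]   # (label, distance) pairs
m = reduce(dempster, (neighbor_bba(l, d, classes) for l, d in neighbors))
```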

[9] viXra:1412.0059 [pdf] submitted on 2014-12-04 03:26:44

URREF Reliability Versus Credibility in Information Fusion (STANAG 2511)

Authors: Erik Blasch, Kathryn B. Laskey, Anne-Laure Jousselme, Valentina Dragos, Jean Dezert, Paulo C. G. Costa
Comments: 8 Pages.

For many operational information fusion systems, both reliability and credibility are evaluation criteria for collected information. The Uncertainty Representation and Reasoning Evaluation Framework (URREF) is a comprehensive ontology that represents measures of uncertainty. URREF supports standards such as the NATO Standardization Agreement (STANAG) 2511, which incorporates categories of reliability and credibility. Reliability has traditionally been assessed for physical machines to support failure analysis. Source reliability of a human can also be assessed. Credibility is associated with a machine process or human assessment of collected evidence for information content. Other related constructs for URREF are data relevance and completeness. In this paper, we seek to develop a mathematical relation of weight of evidence using credibility and reliability as criteria for characterizing uncertainty in information fusion systems.
Category: General Science and Philosophy

[8] viXra:1412.0058 [pdf] submitted on 2014-12-04 03:28:54

Why Dempster’s Fusion Rule is not a Generalization of Bayes Fusion Rule

Authors: Jean Dezert, Albena Tchamova, Deqiang Han, Jean-Marc Tacnet
Comments: 8 Pages.

In this paper, we analyze the Bayes fusion rule in detail from a fusion standpoint, as well as the emblematic Dempster's rule of combination introduced by Shafer in his Mathematical Theory of Evidence based on belief functions. We propose a new and interesting formulation of Bayes rule and point out some of its properties. A deep analysis of the compatibility of Dempster's fusion rule with the Bayes fusion rule is carried out. We show that Dempster's rule is compatible with the Bayes fusion rule only in the very particular case where the basic belief assignments (bba's) to combine are Bayesian, and where the prior information is modeled either by a uniform probability measure or by a vacuous bba. We show clearly that Dempster's rule becomes incompatible with Bayes rule in the more general case where the prior is truly informative (neither uniform nor vacuous). Consequently, this paper proves that Dempster's rule is not a generalization of the Bayes fusion rule.
Category: General Science and Philosophy
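The compatibility case stated in the abstract above can be checked numerically: for Bayesian bba's (all mass on singletons) and a uniform prior, Dempster's rule and Bayes fusion P(x|z1,z2) ∝ P(z1|x)P(z2|x) give the same result. The likelihood values below are illustrative.

```python
# Dempster's rule vs. Bayes fusion with a uniform prior, on Bayesian bba's.

def dempster(m1, m2):
    combined, conflict = {}, 0.0
    for f1, v1 in m1.items():
        for f2, v2 in m2.items():
            inter = f1 & f2
            if inter:
                combined[inter] = combined.get(inter, 0.0) + v1 * v2
            else:
                conflict += v1 * v2
    return {f: v / (1.0 - conflict) for f, v in combined.items()}

def bayes_fusion(like1, like2):
    """Bayes fusion with a uniform prior: normalized product of likelihoods."""
    prod = {x: like1[x] * like2[x] for x in like1}
    z = sum(prod.values())
    return {x: v / z for x, v in prod.items()}

like1 = {"a": 0.8, "b": 0.2}
like2 = {"a": 0.4, "b": 0.6}
bayes = bayes_fusion(like1, like2)
ds = dempster({frozenset(x): v for x, v in like1.items()},
              {frozenset(x): v for x, v in like2.items()})
# bayes["a"] and ds[frozenset("a")] agree in this Bayesian, uniform-prior case
```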

[7] viXra:1412.0057 [pdf] submitted on 2014-12-04 03:31:08

An Alternative Combination Rule for Evidential Reasoning

Authors: Faouzi Sebbak, Farid Benhammadi, M’hamed Mataoui, Sofiane Bouznad, Yacine Amirat
Comments: 8 Pages.

In order to obtain normal behavior in the combination of bodies of evidence, this paper proposes a new combination rule. This rule includes the cardinality of focal set elements in the conjunctive operation and in the conflict redistribution at all steps. Based on the focal set cardinalities, the conflict is redistributed using weighting factors computed from the original masses and the conjunctive masses assigned to each focal element. This strategy forces the conflict redistribution in favor of the more committed hypothesis. Our method is evaluated and compared on numerical examples reported in the literature. As a result, this rule redistributes the conflict in favor of the more committed hypothesis and gives an intuitive interpretation for combining multiple information sources, with coherent results.
Category: General Science and Philosophy

[6] viXra:1412.0055 [pdf] submitted on 2014-12-04 03:34:29

Can We Trust Subjective Logic for Information Fusion?

Authors: Jean Dezert, Albena Tchamova, Deqiang Han, Jean-Marc Tacnet
Comments: 8 Pages.

In this paper, we provide a deep examination of the main foundations of Subjective Logic (SL) and reveal serious problems with them. We also propose a new and interesting alternative way of building a normal coarsened basic belief assignment from a refined one. The defects in the SL fusion rule and the problems in the link between opinions and Beta probability density functions are also analyzed. Some numerical examples and related analyses are provided to justify our viewpoints.
Category: General Science and Philosophy
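The opinion-to-Beta link scrutinized in the abstract above can be sketched under Jøsang's usual convention (an assumption here: non-informative prior weight W = 2, binomial opinion (b, d, u, a) with b + d + u = 1).

```python
# Map a binomial Subjective Logic opinion to Beta(alpha, beta) parameters.

W = 2.0  # non-informative prior weight (Jøsang's usual choice)

def opinion_to_beta(b, d, u, a):
    """Opinion (b, d, u, a) -> Beta parameters; requires u > 0."""
    r = W * b / u                         # pseudo-count of positive evidence
    s = W * d / u                         # pseudo-count of negative evidence
    return r + W * a, s + W * (1.0 - a)

alpha, beta = opinion_to_beta(b=0.6, d=0.2, u=0.2, a=0.5)
mean = alpha / (alpha + beta)   # matches the opinion's expectation E = b + a*u
```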

[5] viXra:1412.0054 [pdf] submitted on 2014-12-04 03:36:21

Characterization of Hard and Soft Sources of Information: a Practical Illustration

Authors: Anne-Laure Jousselme, Anne-Claire Boury-Brisset, Benoît Debaque, Donald Prevost
Comments: 8 Pages.

Physical sensors (hard sources) and humans (soft sources) have complementary features in terms of perception, reasoning and memory. It is thus natural to combine their associated information for wider coverage of the diversity of the available information and thus provide enhanced situation awareness for the decision maker. While the fusion domain mainly (although not exclusively) considers the processing and combination of information from hard sources, conciliating these two broad areas is gaining more and more interest in the domain of hard and soft fusion. In order to better understand the diversity and specificity of sources of information, we propose a functional model of a source of information and a structured list of dimensions along which a source of information can be qualified. We illustrate some of these properties on real data gathered from a light detection experiment in a fog chamber involving both automatic and human detectors.
Category: General Science and Philosophy

[4] viXra:1412.0052 [pdf] submitted on 2014-12-04 03:39:39

Evaluations of Evidence Combination Rules in Terms of Statistical Sensitivity and Divergence

Authors: Deqiang Han, Jean Dezert, Yi Yang
Comments: 7 Pages.

The theory of belief functions is one of the most important tools in information fusion and uncertainty reasoning. Dempster's rule of combination and its related modified versions are used to combine independent pieces of evidence. However, until now there have been no solid evaluation criteria or methods for these combination rules. In this paper, we regard evidence combination as a procedure of estimation, and we propose a set of criteria to evaluate the sensitivity and divergence of different combination rules, drawing on the mean square error (MSE), the bias and the variance. Numerical examples and simulations are used to illustrate the proposed evaluation criteria. Related analyses are also provided.
Category: General Science and Philosophy

[3] viXra:1412.0051 [pdf] submitted on 2014-12-04 03:42:30

The Theory of Belief Functions is One of the Most Important Tools in Information Fusion and Uncertainty Reasoning. Dempster’s Rule of Combination and Its Related Modified Versions Are Used to Combine Independent Pieces of Evidence. However, Until Now Ther

Authors: Zhun-ga Liu, Quan Pan, Jean Dezert, Gregoire Mercier, Yong Liu
Comments: 8 Pages.

Information fusion techniques like evidence theory have been widely applied in data classification to improve the performance of classifiers. A new fuzzy-belief K-nearest neighbor (FBK-NN) classifier is proposed based on evidential reasoning for dealing with uncertain data. In FBK-NN, each labeled sample is assigned a fuzzy membership to each class according to its neighborhood. For each input object to classify, K basic belief assignments (BBA's) are determined from the distances between the object and its K nearest neighbors, taking the neighbors' memberships into account. The K BBA's are fused by a new method, and the fusion results are used to decide the class of the query object. The FBK-NN method performs credal classification and discriminates among specific classes, meta-classes and an ignorant class. Meta-classes are defined by the disjunction of several specific classes, and they make it possible to model the partial imprecision of the classification of objects. The introduction of meta-classes in the classification procedure reduces misclassification errors. The ignorant class is employed for outlier detection. The effectiveness of FBK-NN is illustrated through several experiments, with a comparative analysis with respect to other classical methods.
Category: General Science and Philosophy

[2] viXra:1412.0050 [pdf] submitted on 2014-12-04 03:44:18

Multi-level Fusion of Hard and Soft Information

Authors: Joachim Biermann, Jesus Garcia, Ksawery Krenc
Comments: 8 Pages.

Driven by the underlying need for a yet-to-be-developed framework for fusing heterogeneous data and information at different semantic levels, coming from both sensory and human sources, we present some results of the research being conducted within the NATO Research Task Group IST-106 / RTG-051 on “Information Filtering and Multi Source Information Fusion”. As part of this ongoing effort, we discuss a first outcome of our investigation of multi-level fusion. It deals with removing the first hurdle between data/information sources and processes at different levels: representation. Our contention is that a common representation and description framework is the premise for enabling processing across different semantic levels. To this end we discuss the use of the Battle Management Language (BML) as a “lingua franca” to encode sensory data and a priori and contextual knowledge, both as hard and soft data.
Category: General Science and Philosophy

[1] viXra:1412.0033 [pdf] submitted on 2014-12-02 00:36:39

Uncertainty Evaluation for an Ultrasonic Data Fusion Based Target Differentiation Problem Using Generalized Aggregated Uncertainty Measure 2

Authors: M. Khodabandeh, A. Mohammad-Shahri
Comments: 11 Pages.

The purpose of this paper is uncertainty evaluation in a target differentiation problem in which ultrasonic data fusion is performed using Dezert-Smarandache theory (DSmT). Besides presenting a scheme for target differentiation using ultrasonic sensors, the paper evaluates the DSmT-based fused results from an uncertainty point of view. The study obtains patterns of data for targets with a set of two ultrasonic sensors and applies a neural network as a target classifier to categorize the data of each sensor. The results are then fused by DSmT to make the final decision. The Generalized Aggregated Uncertainty measure GAU2, an extension of the Aggregated Uncertainty (AU), is applied to evaluate the DSmT-based fused results. GAU2, unlike AU, is applicable to measuring uncertainty in DSmT frameworks and can deal with continuous problems. Therefore, GAU2 is an efficient measure for helping the decision maker to evaluate results more accurately, and smoother final decisions are made by DSmT in comparison to DST.
Category: General Science and Philosophy