[465] viXra:2412.0133 [pdf] submitted on 2024-12-21 15:00:11
Authors: Ren Sakai
Comments: 14 Pages.
This paper presents a novel approach for solving NP problems by integrating advanced mathematical theories, extensive experimental validation, efficient utilization of computational resources, and interdisciplinary methods. By leveraging recent advancements in number theory and graph theory along with optimized computational techniques, we aim to provide a comprehensive framework that addresses the complexities of NP problems, ultimately leading to their complete resolution.
Category: Data Structures and Algorithms
[464] viXra:2412.0117 [pdf] submitted on 2024-12-19 12:31:55
Authors: Ren Sakai
Comments: 12 Pages.
This paper presents a comprehensive approach to solving NP problems by integrating advanced mathematical theories, extensive experimental validation, efficient utilization of computational resources, and interdisciplinary methods. By leveraging recent advancements in number theory, graph theory, and various computational techniques, we aim to provide a robust framework that addresses the complexities of NP problems and progresses towards the ultimate resolution of the P=NP question. Key contributions of this research include the development and optimization of new polynomial-time algorithms for specific NP-complete problems, such as the Hamiltonian Cycle and Knapsack Problems. We also incorporate probabilistic and quantum computing methods to enhance algorithmic efficiency. Extensive simulations and large-scale experiments validate the practical applicability and scalability of these algorithms across diverse datasets and real-world applications. Additionally, this paper explores the integration of interdisciplinary approaches, such as dynamic systems theory, ergodic theory, Monte Carlo methods, and insights from fields like biology, economics, and social network analysis. These combined efforts create a multifaceted strategy for tackling NP problems, making significant strides towards a potential solution to one of the most critical questions in theoretical computer science.
Category: Data Structures and Algorithms
[463] viXra:2411.0118 [pdf] submitted on 2024-11-18 14:51:24
Authors: Samir Bouftass
Comments: 8 Pages.
In this paper, we show that the subset sum problem is solvable in polynomial time.
Category: Data Structures and Algorithms
[462] viXra:2410.0144 [pdf] submitted on 2024-10-23 18:37:17
Authors: Fred Strohm
Comments: 33 Pages.
A polynomial time algorithm for solving the Traveling Salesman Problem is described.
Category: Data Structures and Algorithms
[461] viXra:2410.0133 [pdf] submitted on 2024-10-22 22:00:06
Authors: Yong Dok Pyon, Chol Ju Han, Kwang Chol Bang, Song Jin Jong
Comments: 6 Pages.
When creating a Node-based project, it is necessary to find solutions for managing tens or hundreds of thousands of node modules and sources, and for protecting the source code in distribution. Existing products such as NW.js provide some of those solutions. In this paper, we propose a file request processing method using an encrypted file set, based on an analysis of file request processing in the open source project Node.js; implement a management method for source code and resource files; verify the practicality and effectiveness of the proposed methods; and establish a new distribution and protection mode for Node-based products.
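As a rough illustration of the general idea of serving sources from an encrypted file set (the paper targets Node.js; this Python sketch, the `cryptography` library choice and the `EncryptedFileSet` class are assumptions for illustration, not the authors' implementation):

```python
# Minimal sketch of "file request processing" over an encrypted file set:
# sources are stored encrypted and decrypted only on request, in memory.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

class EncryptedFileSet:
    def __init__(self, key: bytes):
        self._fernet = Fernet(key)
        self._store = {}  # path -> encrypted bytes

    def pack(self, path: str, source: bytes) -> None:
        # Encrypt a source file before distribution.
        self._store[path] = self._fernet.encrypt(source)

    def request(self, path: str) -> bytes:
        # Handle a file request: decrypt in memory, never write plaintext to disk.
        return self._fernet.decrypt(self._store[path])

key = Fernet.generate_key()
fs = EncryptedFileSet(key)
fs.pack("lib/main.js", b"console.log('hello');")
assert fs.request("lib/main.js") == b"console.log('hello');"
```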
Category: Data Structures and Algorithms
[460] viXra:2410.0130 [pdf] submitted on 2024-10-22 21:55:54
Authors: YunHak Ri, TaeHyok Kim, YongHo Kim
Comments: 21 Pages.
This paper analyses the disturbance error coefficients of the active disturbance rejection control (ADRC) system and proposes a new method to improve disturbance rejection performance using an m-th order extended state observer (ESO), in order to guarantee high-precision stabilisation of control systems. With a first order extended state observer in the ADRC system, disturbance rejection performance is limited by the disturbance error coefficients; moreover, the m-th order extended state observer has hardly been used for active disturbance rejection control. In this paper, the relationship between the error coefficients and the observer gain coefficients, and the relationship between the error coefficients and the observer's natural angular frequency, are analysed for a plant represented in canonical form according to the extended state order m, and the disturbance rejection performance for constant, ramp, parabolic and harmonic disturbances is analysed in comparison with the results in [32]. The proposed method shows that constant, ramp and parabolic disturbances are rejected perfectly as the extended state order m increases, while the rejection performance for a harmonic disturbance of the same frequency improves by a multiple of m. By taking advantage of the ADRC principle and method and introducing the m-th order ESO, the disturbance rejection performance can be improved by driving the error coefficients to zero according to the extended state order m.
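For context, a minimal simulation of the conventional baseline the paper extends — an ESO with a single extended state (m = 1) estimating a constant total disturbance for a double-integrator plant; all gains and numbers below are illustrative choices, not the paper's:

```python
# Baseline ESO sketch (m = 1) for the plant  x1' = x2,  x2' = f + b*u,
# with the total disturbance f estimated as the extended state z3.
# Gains use the usual bandwidth parameterization; values are illustrative.
import numpy as np

dt, w0, b = 1e-3, 20.0, 1.0          # step, observer bandwidth, input gain
b1, b2, b3 = 3*w0, 3*w0**2, w0**3    # observer gains for m = 1

x = np.array([0.0, 0.0])             # true plant state
z = np.array([0.0, 0.0, 0.0])        # observer state (z3 ~ disturbance)
u = 0.0
for k in range(5000):
    f = 2.0                          # constant disturbance (rejected exactly)
    # plant step (Euler integration)
    x = x + dt * np.array([x[1], f + b*u])
    # ESO step driven by the output error e = y - z1
    e = x[0] - z[0]
    z = z + dt * np.array([z[1] + b1*e, z[2] + b*u + b2*e, b3*e])

print(f"estimated disturbance z3 = {z[2]:.3f} (true f = 2.0)")
```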
Category: Data Structures and Algorithms
[459] viXra:2409.0090 [pdf] submitted on 2024-09-16 00:21:44
Authors: Jyotirmay Kirtania
Comments: 19 Pages.
Adverse Drug Reactions (ADRs) are a leading cause of hospital admissions and healthcare costs. Traditional methods of ADR reporting often rely on post-marketing surveillance and manual reporting of ADRs to the local or national pharmacovigilance agencies for causality assessment and final reporting to the WHO. High-income countries have their own national (e.g., the USFDA) and regional (e.g., the European Medicines Agency, EMA) pharmacovigilance agencies. However, this process is slow and inefficient. This article proposes a novel framework for integrating ADR detection into clinical workflows using Electronic Medical Record (EMR) systems, crowdsourced reporting from patients and healthcare professionals, and graph theory for generating automated ADR signals and reports to the local or national pharmacovigilance agencies. The system leverages automated data collection from EMRs (drug prescriptions, clinical notes) by EMR data scraping, integrating ADR dictionaries and drug databases to automate the generation of ranked ADR signals. By applying graph theory, the system filters and upranks connections between drugs and ADRs, considering the temporal relationship between drug administration and ADR occurrence. This automated approach offers a significant improvement in ADR reporting, enabling faster detection and more accurate predictions. Methodologies, framework visualizations and Python code snippets are included to aid implementation.
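A minimal sketch of the kind of graph-based signal ranking described, assuming toy EMR records, a hypothetical 14-day window and a simple proximity weight (none of which are specified by the abstract):

```python
# Illustrative sketch (not the paper's code): rank drug-ADR signals on a
# graph, upweighting pairs where the ADR closely follows the drug in time.
import networkx as nx

# (patient, drug, day_given) and (patient, adr, day_observed) from EMR scraping
drug_events = [("p1", "drugA", 0), ("p2", "drugA", 3), ("p2", "drugB", 3)]
adr_events  = [("p1", "rash", 2), ("p2", "rash", 5), ("p2", "nausea", 40)]

G = nx.Graph()
for patient, drug, d_day in drug_events:
    for pat2, adr, a_day in adr_events:
        if patient == pat2 and 0 <= a_day - d_day <= 14:
            # Temporal proximity increases the edge weight (signal strength).
            w = 1.0 / (1 + (a_day - d_day))
            if G.has_edge(drug, adr):
                G[drug][adr]["weight"] += w
            else:
                G.add_edge(drug, adr, weight=w)

# Rank candidate signals by accumulated weight.
for drug, adr, w in sorted(G.edges(data="weight"), key=lambda e: -e[2]):
    print(f"{drug} -> {adr}: score {w:.2f}")
```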
Category: Data Structures and Algorithms
[458] viXra:2409.0079 [pdf] submitted on 2024-09-15 23:04:23
Authors: Oluwashola Aremu, Dan Taiye
Comments: 14 Pages.
This study explores the application of neural networks to predict product delivery times in procurement processes, utilizing a large synthetic dataset. As timely delivery is crucial for supply chain efficiency, accurate prediction of procurement timelines can significantly enhance operational planning and resource allocation. Our research employs a multi-layer neural network model trained on a synthetically generated dataset of 1 million entries. The dataset incorporates key procurement attributes including purchase value, complexity, procurement method, product type, number of potential suppliers, urgency, organizational size, team experience, budget availability, geographical location, season, and industry sector. By using synthetic data, we overcome common limitations in procurement research such as data scarcity and confidentiality issues, while still capturing the complex interrelationships between variables. The neural network model demonstrates promising results in predicting delivery times, outperforming traditional linear regression models. Our findings suggest that certain attributes, such as complexity, procurement method, geographical location and budget availability have a more significant impact on delivery time predictions. The study also highlights the potential of machine learning techniques in procurement analytics and decision support. While based on synthetic data, this research provides a foundation for future studies using real-world procurement data. It also offers insights into the key factors influencing procurement timelines and demonstrates the potential of neural networks in enhancing procurement efficiency.
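A toy reconstruction of the described setup — synthetic records fed to a multi-layer network — where the feature choices, sizes and coefficients are placeholders of this sketch, not the study's dataset:

```python
# Synthetic-data + multi-layer-network sketch; all numbers are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 10_000                                    # the study used 1,000,000 entries
X = np.column_stack([
    rng.uniform(1e3, 1e6, n),                 # purchase value
    rng.integers(1, 6, n),                    # complexity (1-5)
    rng.integers(0, 3, n),                    # procurement method (encoded)
    rng.integers(1, 20, n),                   # number of potential suppliers
])
# Synthetic ground truth: delivery time (days) driven mainly by complexity.
y = 5 + 4 * X[:, 1] + 0.5 * X[:, 2] - 0.2 * X[:, 3] + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32),
                                   max_iter=500, random_state=0))
model.fit(X_tr, y_tr)
print("held-out R^2 on synthetic data:", round(model.score(X_te, y_te), 3))
```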
Category: Data Structures and Algorithms
[457] viXra:2409.0053 [pdf] submitted on 2024-09-10 11:00:32
Authors: Taha Sochi
Comments: 8 Pages.
We envisage theoretical structures (especially in pure mathematics and theoretical physics) as networks made of elementary propositions (representing nodes) interconnected through deductive relationships (representing throats). This vision can be exploited as a basis for employing traditional network modeling techniques in the automated search for new theorems, as well as in the automated proving of proposed theorems and conjectures. This deductive, deterministic and intuitive approach can replace some of the conventional approaches (which are generally more sophisticated and elaborate, and hence more expensive) in certain areas of automated and assisted theorem proving, in addition to its benefit in the automated search for novel theorems. We admit, however, that it has a number of limitations and shortcomings, although the same applies to other methods in this field; moreover, some of these limitations and shortcomings can be overcome by reformulating certain theoretical structures, where the viability of this reformulation rests on our perception of theoretical structures as elaborate high-level linguistic systems.
Category: Data Structures and Algorithms
[456] viXra:2408.0017 [pdf] submitted on 2024-08-05 21:03:47
Authors: Hyeon-Ho Song
Comments: 10 Pages. (Note by viXra Admin: Please cite and list scientific references)
This paper proves that P=NP. The paper consists of a basic frame and examples. The P=NP problem is related to the characteristics of the problem, and an attempt was made to use this to make the problem contradictory and solve it through the reduction method.
Category: Data Structures and Algorithms
[455] viXra:2407.0093 [pdf] submitted on 2024-07-14 12:36:13
Authors: Pieter Cawood
Comments: 4 Pages.
This paper proposes a modified deadlock avoidance method for the TA-Prioritized algorithm by Liu et al. [1]. Their algorithms were developed for offline Multi-Agent Pickup and Delivery (MAPD) problems, where a team of agents has delivery tasks with known release times (when the tasks are ready for pickup). Offline MAPD problems arise in settings such as warehouses and factories where the release times of tasks are known in advance; Liu et al. [1] use this information to compute a good task sequence for each agent with a travelling salesman problem (TSP) solver, and the task sequences are then used to plan agent paths. The proposed deadlock avoidance method holds a pickup location (keeping an agent stationed at a vertex) if an agent reaches it before the release time, and retries a path delayed by one timestep from the agent's initial parking location if path finding for any of the agent's tasks fails.
Category: Data Structures and Algorithms
[454] viXra:2406.0157 [pdf] submitted on 2024-06-26 19:20:20
Authors: Junho Eom
Comments: 16 Pages. 2 figures
At least one prime less than n (n >= 2) is known to be used as a factor for the composites between n and n^2, and this is explained by prime wave analysis. In this paper, prime wave analysis is modified with a modular operator and applied to finding new primes within a limited boundary. As a result, using the known primes less than 3, composites were eliminated and the remaining prime candidates were collected within the limited boundary between 3 and 3^2. The boundary was then sequentially extended from 3^2 to 9^2, 81^2, and 6561^2, yielding 2, 18, 825, and 2606318 prime candidates; these candidates were verified as new primes using online databases. In addition, the boundary was extended from 6561^2 to 43046721^2, and serial new primes were also found within a randomly selected boundary between 6561^2 and 43046721^2. Overall, it is concluded that prime wave analysis modified with a modular operator can be a practical technique for finding new primes within a limited boundary.
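A minimal sketch of the underlying claim that the primes up to n suffice to sieve out every composite in the boundary (n, n^2]; the sieve below is a generic implementation, not necessarily the paper's modular-operator formulation:

```python
# Sieve the boundary (n, n^2] using only the primes <= n: every composite in
# that range has a factor <= n, so what survives is exactly the new primes.
def primes_in_boundary(n, base_primes):
    """Return primes in (n, n*n] given all primes <= n."""
    lo, hi = n + 1, n * n
    is_candidate = [True] * (hi - lo + 1)
    for p in base_primes:
        # first multiple of p inside [lo, hi]
        start = ((lo + p - 1) // p) * p
        for m in range(start, hi + 1, p):
            is_candidate[m - lo] = False
    return [lo + i for i, ok in enumerate(is_candidate) if ok]

# Example matching the abstract: sieving (3, 9] with {2, 3} leaves 2 candidates.
print(primes_in_boundary(3, [2, 3]))   # -> [5, 7]
```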
Category: Data Structures and Algorithms
[453] viXra:2406.0088 [pdf] submitted on 2024-06-18 15:59:15
Authors: Bo Tian
Comments: 18 Pages.
In this paper, a new algorithm for solving the minimal enclosing ball (MEB) problem is proposed, based on new understandings of the geometric properties of the problem. A substitute for Ritter's algorithm is proposed to obtain approximate results with higher precision, and a (1+ε)-approximation algorithm is presented that reaches a specified precision in much less time than existing algorithms.
Category: Data Structures and Algorithms
[452] viXra:2406.0050 [pdf] submitted on 2024-06-11 20:02:54
Authors: Chun-Hu Cui, He-Song Cui
Comments: 38 Pages.
In DeFi (Decentralized Finance) applications, and in dApps (Decentralized Application) generally, it is common to periodically pay interest to users as an incentive, or periodically collect a penalty from them as a deterrent. If we view the penalty as a negative reward, both the interest and penalty problems come down to the problem of distributing rewards. Reward distribution is quite accomplishable in financial management using general computers, but on a blockchain, where computational resources are inherently expensive and the amount of computation per transaction is absolutely limited with a predefined, uniform quota, not only do the system administrators have to pay heavy gas fees if they handle rewards of many users one by one, but the transaction may also be terminated on the way. The computational quota is generally large enough, but cannot guarantee processing an unknown number of users. We propose novel algorithms that solve Simple Interest, Simple Burn, Compound Interest, and Compound Burn tasks, which are a typical component of DeFi applications. If we put numerical errors aside, these algorithms realize accurate distribution of rewards to an unknown number of users, with no approximation, while adhering to the computational quota per transaction. For those who might already be using similar algorithms, we prove the algorithms in a rigorous manner so that they can be transparently presented to users and stockholders. We also introduce reusable concepts and notations, and demonstrate how they can be efficiently used in reasoning and inference of dApps. We demonstrate, through simulated tests spanning over 128 simulated years, that the numerical errors do not grow to a level that is dangerous in practice.
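For context, a sketch of the widely used O(1)-per-transaction "reward per share" accumulator pattern, which keeps per-user work independent of the number of users; this is a generic illustration of the problem setting, not the paper's Simple/Compound Interest or Burn algorithms:

```python
# Accumulator pattern: distributing a reward is O(1) regardless of user count,
# and each user's payout is settled lazily in O(1) when they next interact.
class RewardPool:
    def __init__(self):
        self.total_staked = 0
        self.acc_per_share = 0.0        # cumulative reward per staked unit
        self.stake = {}                 # user -> amount staked
        self.debt = {}                  # user -> acc_per_share at last action

    def distribute(self, reward):
        # O(1): fold the reward into the global accumulator.
        if self.total_staked:
            self.acc_per_share += reward / self.total_staked

    def deposit(self, user, amount):
        self.settle(user)
        self.stake[user] = self.stake.get(user, 0) + amount
        self.total_staked += amount

    def settle(self, user):
        # O(1) per user: pay out everything accrued since the last action.
        owed = self.stake.get(user, 0) * (self.acc_per_share - self.debt.get(user, 0.0))
        self.debt[user] = self.acc_per_share
        return owed

pool = RewardPool()
pool.deposit("alice", 100); pool.deposit("bob", 300)
pool.distribute(40)                                 # split pro rata: 10 / 30
print(pool.settle("alice"), pool.settle("bob"))     # -> 10.0 30.0
```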
Category: Data Structures and Algorithms
[451] viXra:2406.0046 [pdf] submitted on 2024-06-10 20:05:12
Authors: Junho Eom
Comments: 16 Pages. 4 figures
The primes less than a given number n (n >= 2) determine the new primes within a limited area, increased by a square (n^2) or decreased by a square root (sqrt(n)). As the area is extended, the number of primes changes and is controlled within an extended area boundary, or number boundary, from n to n^2 or from n to sqrt(n). The structure of a number boundary is applied to the Euler product and helps to characterize Euler's prime boundary between n and (n^2 - 1). The characterized Euler product is then used to characterize the non-trivial zeroes of the Riemann zeta function derived in an elementary way, and the characterized Euler product and non-trivial zeroes are discussed with regard to their potential number boundaries. Overall, it is concluded that the characteristics of a number boundary can represent the characteristics of primes, especially the number of primes. Since the number boundary is characterized by an increased or decreased exponent while the base, or given number n, is fixed, it is concluded that the pattern of exponents in the number boundary would be a key to understanding the pattern of primes.
Category: Data Structures and Algorithms
[450] viXra:2405.0074 [pdf] submitted on 2024-05-14 21:17:31
Authors: Han Yong Gil, Min Hyok, Kim Song Hyok, Choe Yong Su, Song Kwang Hyok, Choe Tae Hyok, Kim Jing Yong
Comments: 15 Pages.
This paper describes the development and application of the Net-oriented System Description Language (NSDL), a new and independent software development tool combining the advantages of Petri nets and the object-oriented programming language Visual Basic (VB). Unlike previous tools, transitions for custom controls such as Textbox, Table, Graph, Button and Checkbox, and the extension (or restriction) of places, transitions and arcs, were introduced to improve the modeling capability of Petri nets. NSDL enhances the flexibility, convenience and extensibility of software development through VB language support based on the Microsoft .NET Framework 4.0 library, which is easy to use and learn. Approximation of a BP neural net was carried out to validate NSDL's effectiveness in three ways. NSDL can be used in the development of software or the modelling of complex information systems thanks to its great modeling capability.
Category: Data Structures and Algorithms
[449] viXra:2405.0040 [pdf] submitted on 2024-05-07 21:07:59
Authors: Hak Kun Ri, Chol Hun Pak, Nam Song An
Comments: 7 Pages.
The Genetic Algorithm (GA) is one of the most popular swarm-based evolutionary search algorithms and involves multiple data-independent computations. Such computations can be performed in parallel on GPU cores using the Compute Unified Device Architecture (CUDA) platform. In this paper, the various operations of a GA, such as fitness evaluation, selection, crossover and mutation, are implemented in parallel on GPU cores, and their performance is compared with a serial implementation. Results show that the overall computational time can be decreased substantially by parallel implementation on GPU cores: the proposed implementations ran 1.18 to 3.68 times faster than the corresponding serial implementation on a CPU.
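A sketch of the data-parallel structure of one GA generation, with numpy vectorization standing in for the CUDA kernels the paper implements; the one-max problem and all rates are toy choices of this sketch:

```python
# One GA generation with fitness, selection, crossover and mutation applied
# to the whole population at once (numpy stands in for per-core parallelism).
import numpy as np

rng = np.random.default_rng(1)
pop = rng.integers(0, 2, size=(256, 64))        # 256 bitstring individuals

def step(pop):
    fitness = pop.sum(axis=1)                   # one-max, evaluated in parallel
    # Tournament selection, vectorized: compare random pairs of individuals.
    a, b = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((fitness[a] > fitness[b])[:, None], pop[a], pop[b])
    # Single-point crossover on consecutive parent pairs.
    cut = rng.integers(1, pop.shape[1], len(pop) // 2)
    children = parents.copy()
    for i, c in enumerate(cut):                 # swap tails of each pair
        children[2*i, c:], children[2*i+1, c:] = (parents[2*i+1, c:].copy(),
                                                  parents[2*i, c:].copy())
    # Mutation: flip each bit with small probability, in parallel.
    flips = rng.random(children.shape) < 0.01
    return np.where(flips, 1 - children, children)

for _ in range(50):
    pop = step(pop)
print("best fitness:", pop.sum(axis=1).max())
```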
Category: Data Structures and Algorithms
[448] viXra:2404.0121 [pdf] submitted on 2024-04-25 20:45:49
Authors: Shreyansh Vyas
Comments: 10 Pages. In Russian (Correction made by viXra Admin - Please cite and list scientific references)
This paper presents a comprehensive review of existing TSP (Travelling Salesman Problem) algorithms, followed by the introduction of a novel algorithm designed to efficiently provide optimal solutions within polynomial time. The proposed algorithm not only yields a lower bound on the TSP weight but also guarantees optimality under specific conditions. Through rigorous analysis and experimentation, this research contributes to advancing the field of TSP optimization by offering a reliable and scalable solution.
Category: Data Structures and Algorithms
[447] viXra:2404.0074 [pdf] submitted on 2024-04-15 23:13:47
Authors: Yuly Shipilevsky
Comments: 5 Pages.
We transform an NP-complete problem into a polynomial-time algorithm, which would mean that P = NP.
Category: Data Structures and Algorithms
[446] viXra:2403.0122 [pdf] submitted on 2024-03-25 06:33:09
Authors: Priyanshi Thakkar, Nishant Doshi
Comments: 14 Pages.
In the realm of cybersecurity, the preservation of privacy in data analysis processes is of paramount importance. This paper explores the application of privacy-preserving techniques, particularly focusing on the pivotal role of differential privacy. Differential privacy offers a rigorous mathematical framework to quantify privacy guarantees, ensuring that data analysis outcomes do not compromise individual privacy. Amidst escalating concerns surrounding privacy breaches and data vulnerabilities, the adoption of robust privacy-preserving measures becomes imperative. Through an extensive literature review, this paper delves into the theoretical foundations of differential privacy, evaluates its effectiveness in practical applications, and outlines future research directions. By elucidating the potential of this technique, the paper aims to contribute to the advancement of privacy-preserving practices and bolster the overall security posture in the cybersecurity landscape.
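For concreteness, the textbook Laplace mechanism that underlies the epsilon-differential-privacy guarantees the paper reviews (a generic illustration, not code from the paper):

```python
# Laplace mechanism: adding noise with scale sensitivity/epsilon to a query
# result yields epsilon-differential privacy for that query.
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0,
                  rng=np.random.default_rng()):
    # A counting query changes by at most 1 when one record is added or
    # removed, so sensitivity = 1; larger epsilon = less noise, weaker privacy.
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

print(laplace_count(true_count=1234, epsilon=0.5))   # noisy, private release
```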
Category: Data Structures and Algorithms
[445] viXra:2403.0121 [pdf] submitted on 2024-03-25 06:34:03
Authors: Priyanshi Thakkar, Nishant Doshi
Comments: 9 Pages.
The Internet of Things (IoT) represents a transformative paradigm shift in technology, enabling the seamless integration of physical objects into the digital realm through internet connectivity. This paper explores the foundational components of IoT, including devices, sensors, and connectivity infrastructure, and examines its wide-ranging applications across industries such as healthcare, smart cities, and manufacturing. While IoT promises unparalleled automation and efficiency, the paramount importance of cybersecurity cannot be overstated. As IoT devices proliferate, the need to safeguard data integrity, confidentiality, and availability becomes increasingly critical. Robust cybersecurity measures are essential to protect against potential threats and vulnerabilities that could compromise sensitive information or grant unauthorized access to interconnected devices. This paper underscores the imperative of prioritizing cybersecurity in IoT deployment to ensure the reliability and trustworthiness of interconnected systems in our increasingly digitized world.
Category: Data Structures and Algorithms
[444] viXra:2403.0110 [pdf] submitted on 2024-03-22 07:40:12
Authors: Sin Chol-Nam, Ri Kuk-Jin, Cha Hung-Dok
Comments: 14 Pages.
Certificate verification and revocation are important aspects of public-key infrastructures (PKI). This paper presents the Tree-List Certificate Verification (TLCV) scheme, which uses a novel tree-list structure to provide efficient digital certificate verification. Under this scheme, users in a public-key infrastructure are partitioned into clusters and a separate blacklist of revoked certificates is maintained for each cluster. The verification proof for each cluster's blacklist comes in the form of a hash path and a digital signature. An algorithm to derive an optimal number of clusters that minimizes the TLCV response size is described, and the characteristics of TLCV are examined. Simulations were carried out to compare TLCV against a few other schemes; the performance metrics examined include computational overhead, network bandwidth, overall user delay and storage overhead. In general, we find that TLCV performs relatively well against the other schemes in most aspects.
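A minimal sketch of verifying a hash path of the kind such proofs use, from a cluster blacklist leaf up to a signed root; the hashing layout is an assumption of this sketch, and signature checking is elided:

```python
# Verify a hash path: recompute the root from a leaf plus sibling hashes.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def verify_path(leaf: bytes, path, root: bytes) -> bool:
    """path is a list of (sibling_hash, sibling_is_left) pairs, leaf to root."""
    node = h(leaf)
    for sibling, sibling_is_left in path:
        node = h(sibling + node) if sibling_is_left else h(node + sibling)
    return node == root

# Tiny two-leaf tree: root = H(H(blacklistA) + H(blacklistB)).
leaf_a, leaf_b = b"cluster-A revoked: 17,42", b"cluster-B revoked: 5"
root = h(h(leaf_a) + h(leaf_b))
print(verify_path(leaf_a, [(h(leaf_b), False)], root))   # True
```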
Category: Data Structures and Algorithms
[443] viXra:2403.0001 [pdf] submitted on 2024-03-01 22:20:43
Authors: Romain Viguier
Comments: 18 Pages. (Note by viXra Admin: Please cite scientific references)
This article touches on many things (holograms, computation, time, fields, etc.), with the central object being called the Unifying Circuit.
Category: Data Structures and Algorithms
[442] viXra:2402.0156 [pdf] submitted on 2024-02-27 20:48:26
Authors: Pranshu Mishra, Surbhi Agarwal
Comments: 4 Pages. Performance Engineering, System Design, CC BY-NC-SA 4.0
The paper presents the design and implementation of performance testing software designed for cloud-based enterprise applications. With the increasing adoption of cloud computing in enterprise environments, ensuring the performance and scalability of these cloud-based applications is crucial. Traditional performance testing tools may not fully address the unique challenges posed by cloud environments, as these tools are generally designed as load generators. Therefore, we propose a comprehensive approach to designing performance testing software specifically for cloud-based enterprise applications, in which load generation is one component of its architecture.
Category: Data Structures and Algorithms
[441] viXra:2402.0146 [pdf] submitted on 2024-02-26 04:55:22
Authors: Hui Wang
Comments: 8 Pages.
Today's web sites are accelerated by scripts, but the foundation, the web page itself, is still a static structure. The Document Object Model (DOM) represents the structure of a web page. Here we show a new approach: it is possible to put a timetree and the DOM together to shape a new structure named the Time Object Model (TOM). TOM represents not only a static page but also a dynamic stream. We believe the best way to use TOM is to embed it into an HTML5 page in real time without changing the existing content; it is the only way that works now.
Category: Data Structures and Algorithms
[440] viXra:2402.0125 [pdf] submitted on 2024-02-22 06:15:16
Authors: JongChol Ri, HyonSu Kim
Comments: 12 Pages.
In this paper, we consider a method that detects the top area of a mountain in order to reduce the search area in an image processing system. The curved surface of image brightness (CSIV) on a ground background forms in woods, rocks, undulations of the ground and so on, while the CSIV on a sky background generally forms in relatively very large cloud groups. Therefore, the CSIV on a sky background generally forms large patterns, whereas the CSIV on a ground background forms relatively small patterns. Based on this consideration of the fractal characteristics of sky and ground backgrounds, the mountain-top area, where the appearance probability is greatest, is detected by calculating the fractal dimension of the sky and ground.
Category: Data Structures and Algorithms
[439] viXra:2402.0012 [pdf] submitted on 2024-02-03 22:10:52
Authors: Yemao Xiao
Comments: 35 Pages.
With the development of blockchain technology and the rise of non-fungible tokens (NFTs), this study proposes an innovative blockchain architecture aiming to improve the data-processing flexibility of blockchains by using NFTs as the core data storage and management elements. In this architecture, the NFT not only acts as the basic unit of chain state, but also plays the role of a node in the graph structure maintained by the blockchain, which optimizes the efficiency of data organization and retrieval. In addition, smart contracts, account information, and other blockchain data are abstracted into NFT structures to enhance the security and transparency of the system. This study adopts a hybrid methodology for design and implementation: it first elaborates the NFT-based blockchain design principles and data structures, and then implements smart contracts in real-world cases to verify their usability. The experimental results show that the design not only improves the credibility of data retrieval results, but also enhances the scalability of the blockchain system. This study provides new perspectives and solutions for blockchain technology in dealing with complex data structures and optimizing systematic data retrieval, which is of great significance for future applications and research on dedicated blockchains and NFTs.
Category: Data Structures and Algorithms
[438] viXra:2402.0011 [pdf] submitted on 2024-02-03 15:02:52
Authors: Yunpeng Qi
Comments: 16 Pages.
The aim of this thesis is to propose a reliability enhancement method for a blockchain-based traceability system for high-value commodities, designing the data structure as NFTs (non-fungible tokens) and exploiting the uniqueness of the NFT to improve the trustworthiness and reliability of the system. Through a combination of user-submitted proposals, evaluation platforms, and proof monitoring centers, the system realizes the traceability and supervision of high-value commodities and ensures the uniqueness and anti-counterfeiting of each commodity using the NFT data structure. The system utilizes the decentralized nature of blockchain and the automated execution of smart contracts to improve the transparency, security, and credibility of traceability for high-value commodities.
Category: Data Structures and Algorithms
[437] viXra:2401.0124 [pdf] submitted on 2024-01-24 12:42:56
Authors: Hui Wang
Comments: 12 Pages.
We usually use a timeline to describe time, but a timeline cannot describe dynamic time: because dynamic time keeps changing, the timeline would have to change too, which is impossible. A timeline consists of layers with relationships between them; if we change one layer, the others must be fixed up as well. A timeline cannot do this automatically, so a human must take on the work, and real-time change therefore becomes impossible. That does not matter for film making, but the internet needs instant responses, which a timeline cannot provide. Here we show a new structure called the timetree, an auto-balanced hierarchical structure whose form always remains complete during changes, without help from a human. Making dynamic interactive content on the internet is a challenge the timetree was born for. We have tried the timeline; now it is the turn of the timetree.
Category: Data Structures and Algorithms
[436] viXra:2401.0063 [pdf] submitted on 2024-01-13 21:00:28
Authors: Warren D. Smith
Comments: 17 Pages.
Various methods have been devised to multiply N×N matrices using O(N^E) arithmetic operations as N→∞, for various exponents E with 2<E≤3. However, in practice, the schemes with the least known E are not the best, because they only start to win for infeasibly large N. Prior literature has unhealthily been fixated on E alone, i.e. on the asymptotic large-N performance alone. To address that, we propose investigating, for each method, not merely its E, but also its "breakeven N," meaning the least N causing that method to use fewer than the obvious algorithm's N^3 bilinear multiplications [or fewer than Strassen's O(N^2.807355) scheme]. The set of (E, log N) datapoints then forms a subset of the infinite rectangle (2,3]×[0,∞). Part of that rectangle is filled with datapoints, while another part contains none. What is of interest is the curve delineating the boundary between those two regions.
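A back-of-the-envelope illustration of the breakeven notion: counting bilinear multiplications alone, Strassen already wins at N = 2 (7 versus 8), so this sketch counts all arithmetic operations, using the standard 18 half-size matrix additions per Strassen level:

```python
# Count all arithmetic operations (multiplications and additions) for Strassen,
# recursing down to 1x1 blocks, versus the obvious algorithm, and report the
# first power-of-two size where Strassen is cheaper.
def naive_ops(n):
    return n**3 + n*n*(n - 1)            # n^3 muls + n^2(n-1) adds

def strassen_ops(n):
    if n == 1:
        return 1
    half = n // 2
    # 7 half-size products plus 18 half-size matrix additions per level.
    return 7 * strassen_ops(half) + 18 * half * half

n = 2
while strassen_ops(n) >= naive_ops(n):
    n *= 2
print("breakeven N:", n)                 # -> 1024 for this operation count
```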
Category: Data Structures and Algorithms
[435] viXra:2312.0116 [pdf] submitted on 2023-12-21 05:33:46
Authors: Gi-Yoon Jeon
Comments: 5 Pages.
Authentication is the process of determining whether someone or something is who or what it claims to be, and there are many authentication methods for the digital environment. Digital authentication is divided into three main categories: 'what you have', 'what you know', and 'who you are'. Furthermore, there are multi-factor authentication methods using a combination of two or more of these. However, these methods are always exposed to the risk of forgery, tampering, and theft. This paper proposes a novel authentication architecture that is suitable for Internet of Things (IoT) and Internet of Behaviors (IoB) environments. Technologically, the proposed architecture is a token-based authentication method; however, it is continuous, mimics the real analog world, and has the advantage that counterfeiting is immediately recognizable.
Category: Data Structures and Algorithms
[434] viXra:2312.0019 [pdf] submitted on 2023-12-03 21:08:41
Authors: Hua Li, Lu Zhang, Ruoxi Guo, Zushang Xiao, Rui Guo
Comments: 11 Pages. 6 figures
This paper introduces a watertight technique for handling the boundary representation of surface-surface intersections in CAD. Surfaces play an important role in today's geometric design; the mathematical model of non-uniform rational B-spline (NURBS) surfaces is the mainstream and an ISO standard. In the situation of surface-surface intersection, things are more complicated, since parts of the surfaces may be cut off, producing so-called trimmed surfaces, which have been a central topic of the past decades in the CAD community of both academia and industry. The main problem is that the parametric domain of a trimmed surface is generally not a standard square or rectangle; rather, it is typically bounded by curves, based on point inversion of the intersection points and interpolated. The existence of gaps or overlaps at the intersection boundary makes the preprocessing of CAE and other downstream applications hard, and in this case the NURBS are hard to keep in closed form. Commonly, a special data structure for the intersection curves must be maintained to support downstream applications, so the data structure of the whole CAD system is not unified and the calculation is not efficient. In terms of Bezier surfaces, a special case of NURBS, this paper designs a reparameterization, or normalization, that transforms a trimmed surface into a group of Bezier surface patches on the standard parametric domain [0,1]×[0,1]. The boundary curve of a normalized Bezier surface patch can then be replaced by the intersection curve to achieve watertightness along the boundary. In this way, the trimmed surface is eliminated and the "gap" between CAD and CAE is closed.
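For context, the de Casteljau subdivision that such reparameterizations are typically built on, shown in one dimension (a generic primitive for illustration, not the paper's algorithm; tensor-product surfaces apply it along each parameter axis):

```python
# de Casteljau subdivision: split a Bezier curve at parameter t into two
# sub-curves whose domains can each be re-mapped to [0,1].
def decasteljau_split(ctrl, t):
    """Split a Bezier curve with control values `ctrl` at t; return two halves."""
    left, right, pts = [], [], list(ctrl)
    while pts:
        left.append(pts[0])               # running first point -> left half
        right.append(pts[-1])             # running last point  -> right half
        pts = [(1 - t) * p + t * q for p, q in zip(pts, pts[1:])]
    return left, right[::-1]

# Quadratic Bezier with control values 0, 2, 0 split at t = 0.5:
print(decasteljau_split([0.0, 2.0, 0.0], 0.5))
# -> ([0.0, 1.0, 1.0], [1.0, 1.0, 0.0])
```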
Category: Data Structures and Algorithms
[433] viXra:2309.0146 [pdf] submitted on 2023-09-29 12:49:24
Authors: Geraldine Reichard
Comments: 21 Pages.
In this paper, an algorithm is presented that solves the 3-SAT problem in a polynomial runtime of O(n^3), which implies P=NP. The 3-SAT problem asks whether a logical expression, consisting of clauses of up to 3 literals connected by OR-expressions, which are in turn interconnected by AND-expressions, is satisfiable. For the solution a new data structure, the 3-SAT graph, is introduced. It groups the clauses of a 3-SAT expression into coalitions that contain all clauses whose literals consist of the same variables. The nodes of the graph represent the variables connecting the corresponding coalitions. An algorithm R is introduced that identifies relations between clauses by transmitting markers, called upgrades, through the graph, making use of implications. The algorithm starts sequentially for every variable and creates start upgrades, one for the variable's negated literals and one for its non-negated literals. It is shown that the start upgrades have to be within a specific clause pattern, called an edge pattern, to mark the beginning or end of an unsatisfiable sequence. The algorithm eventually identifies other kinds of patterns within the upgraded clauses. Depending on the pattern, the algorithm either sends the upgrades on through the graph; creates new following upgrades to extend the upgrade path, a subgraph storing all previous upgrades; or, if all connector literals of a pattern have received upgrades of the same path or two corresponding following upgrades circle, marks the upgrade as circling. If an upgrade circles, then it is unsatisfiable on its path. It is proven that if, after several execution steps of algorithm R, two corresponding start upgrades circle, then the expression is unsatisfiable, and that if no upgrade steps are possible anymore and the algorithm did not return unsatisfiable, the expression is satisfiable. Algorithm R is similar to existing algorithms that solve 2-SAT in polynomial time, which also make use of implications on a graph.
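For context, the established polynomial 2-SAT algorithm the abstract compares itself to: build the implication graph (a OR b gives ¬a→b and ¬b→a) and check that no variable shares a strongly connected component with its negation:

```python
# Classic 2-SAT via implication graph + Tarjan SCC (recursive; fine for
# small instances). Literal +i means x_i, -i means NOT x_i.
import sys

def solve_2sat(n_vars, clauses):
    idx = lambda lit: 2*(abs(lit)-1) + (lit < 0)     # node id of a literal
    graph = [[] for _ in range(2*n_vars)]
    for a, b in clauses:
        graph[idx(-a)].append(idx(b))                # not-a implies b
        graph[idx(-b)].append(idx(a))                # not-b implies a
    sys.setrecursionlimit(10000)
    low, num, comp, stack, onstk, counter = {}, {}, {}, [], set(), [0]
    def dfs(v):
        num[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); onstk.add(v)
        for w in graph[v]:
            if w not in num:
                dfs(w); low[v] = min(low[v], low[w])
            elif w in onstk:
                low[v] = min(low[v], num[w])
        if low[v] == num[v]:                         # v is the root of an SCC
            while True:
                w = stack.pop(); onstk.discard(w); comp[w] = v
                if w == v: break
    for v in range(2*n_vars):
        if v not in num: dfs(v)
    # Satisfiable iff no variable is in the same SCC as its negation.
    return all(comp[2*i] != comp[2*i+1] for i in range(n_vars))

# (x1 OR x2) AND (not-x1 OR x2) AND (not-x2 OR x1) is satisfiable (x1=x2=True):
print(solve_2sat(2, [(1, 2), (-1, 2), (-2, 1)]))     # True
```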
Category: Data Structures and Algorithms
[432] viXra:2308.0141 [pdf] submitted on 2023-08-21 22:19:56
Authors: B. Sinchev, A. B. Sinchev, A. М. Mukhanova
Comments: 7 Pages.
Given a set of distinct non-negative integers X^n and a target certificate S, the question is whether ∃ X^k ⊆ X^n with ∑_{x_i ∈ X^k} x_i = S (where k = |X^k|, n = |X^n|). We present a polynomial solution of the subset sum problem with time complexity T ≤ O(n^2) and space complexity S ≤ O(n^2), so that P = NP.
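For contrast, the textbook dynamic program solves subset sum in O(nS) time, which is pseudo-polynomial because S can be exponential in the input length — exactly the gap a genuinely polynomial algorithm would have to close:

```python
# Standard pseudo-polynomial dynamic program for subset sum: O(n*S) time,
# polynomial in the *value* S but not in the bit-length of the input.
def subset_sum(xs, S):
    reachable = {0}                       # sums realizable so far
    for x in xs:
        reachable |= {r + x for r in reachable if r + x <= S}
    return S in reachable

print(subset_sum([3, 34, 4, 12, 5, 2], 9))    # True: 4 + 5
```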
Category: Data Structures and Algorithms
[431] viXra:2308.0047 [pdf] submitted on 2023-08-09 22:25:25
Authors: Sunny Daniels
Comments: 4 Pages. "Complexity Theory" would almost certainly be the best arXiv category for this paper, but there doesn't seem to be any corresponding viXra category.
I appreciate the feedback from Fortnow (the author of [1]) on my recent viXra article [2], and in particular him supplying me with a definition of the standard polynomial-time universal Turing machine [4]. In [3], I clarified the way in which my construction and proofs in [2] could be re-worked to be analogous to the more usual definition of a universal Sigma_2^P Turing machine. Shortly after publishing [2], I realised that my construction in [2] or [3] gives rise to a very, very simple new constructive proof of Kannan's theorem: much simpler than anything that has been published before, if I am not mistaken, and even simpler than my proposed "layer by layer" proof in [2]. I suspect that Fortnow realised this soon after reading [2]: his e-mail correspondence with me seems to hint at this. If so, I appreciate him giving me the opportunity to publish this new proof myself! I present a sketch of this new proof here.
Category: Data Structures and Algorithms
[430] viXra:2308.0031 [pdf] submitted on 2023-08-06 18:46:31
Authors: Sunny Daniels
Comments: 6 Pages. "Complexity Theory" would almost certainly be the best arXiv category for this paper, but there doesn't seem to be any corresponding viXra category.
I appreciate the feedback from Fortnow (the author of [1]) on my recent viXra article [2], and in particular him supplying me with a definition of the standard polynomial-time universal Turing machine. In this article I re-work my previous definitions and proofs to put in the circuit size exponent "k" explicitly (rather than the pseudo-variable "99" in [2]) and use more standard conventions for the derivation of my construction from the standard polynomial-time universal Turing machine construction.
Category: Data Structures and Algorithms
[429] viXra:2307.0148 [pdf] submitted on 2023-07-27 12:41:11
Authors: Matthew Stephenson
Comments: 11 Pages.
The core reasoning task for datalog engines is materialization, the evaluation of a datalog program over a database alongside its physical incorporation into the database itself. The de facto method of computing it is the recursive application of inference rules. Because materialization is a costly operation, datalog engines must provide incremental materialization, that is, adjust the computation to new data instead of restarting from scratch. One of the major caveats is that deleting data is notoriously more involved than adding it, since one has to take into account all possible data that has been entailed from what is being deleted. Differential Dataflow is a computational model that provides efficient incremental maintenance of iterative dataflows, notably with equal performance between additions and deletions, and work distribution. In this paper we investigate the performance of materialization with three reference datalog implementations: one built on top of a lightweight relational engine, and differential-dataflow and non-differential versions of the same rewrite algorithm, with the same optimizations.
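For context, the standard semi-naive materialization loop that datalog engines use, shown for transitive closure (a generic illustration, not one of the three benchmarked implementations):

```python
# Semi-naive evaluation of   path(x,y) :- edge(x,y).
#                            path(x,z) :- path(x,y), edge(y,z).
# Only the newly derived "delta" tuples are joined each round.
def materialize(edges):
    path = set(edges)
    delta = set(edges)
    while delta:
        new = {(x, z) for (x, y) in delta for (y2, z) in edges if y == y2}
        delta = new - path                 # keep only genuinely new facts
        path |= delta
    return path

edges = {(1, 2), (2, 3), (3, 4)}
print(sorted(materialize(edges)))
# [(1, 2), (1, 3), (1, 4), (2, 3), (2, 4), (3, 4)]
```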
Category: Data Structures and Algorithms
[428] viXra:2307.0103 [pdf] submitted on 2023-07-19 09:08:03
Authors: Antonio Viggiano
Comments: 5 Pages. DeFi Security Summit, July 15-16, 2023, Paris, France
This study presents a comparative analysis of randomized testing algorithms, commonly known as fuzzers, with a specific emphasis on their effectiveness in catching bugs in Solidity smart contracts. We employ the non-parametric Mann-Whitney U-test to gauge performance, defined as the ``time to break invariants per mutant'', using altered versions of the widely-forked Uniswap v2 protocol. We conduct 30 tests, each with a maximum duration of 24 hours or 4,294,967,295 runs, and evaluate the speed at which the fuzzers Foundry and Echidna can breach any of the 22 protocol's invariant properties for each of the 12 mutants, created both with mutation testing tools and with manual bug injection methods. The research shows significant performance variabilities between runs for both Foundry and Echidna depending on the instances of mutated code. Our analysis indicates that Foundry was able to break invariants faster in 9 out of 12 tests, while Echidna in 1 out of 12 tests, and in the remaining 2 tests, the difference in performance between the two fuzzers was not statistically significant. The paper concludes by emphasizing the necessity for further research to incorporate additional fuzzers and real-world bugs, and paves ground for further developments of more precise and rigorous evaluations of fuzzer effectiveness.
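The statistical test used in the study, shown on made-up samples of "time to break an invariant" for one mutant (the numbers are illustrative, not the paper's measurements):

```python
# Non-parametric Mann-Whitney U-test comparing two fuzzers' times-to-break.
from scipy.stats import mannwhitneyu

foundry_times = [12.1, 9.8, 15.3, 11.0, 10.2, 14.7, 9.1, 13.5]   # seconds
echidna_times = [88.4, 61.2, 95.0, 70.3, 84.1, 77.8, 90.6, 66.9]

stat, p = mannwhitneyu(foundry_times, echidna_times, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")   # small p: the difference is significant
```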
Category: Data Structures and Algorithms
[427] viXra:2307.0088 [pdf] submitted on 2023-07-17 15:06:38
Authors: Mirzakhmet Syzdykov
Comments: 2 Pages.
We present a continuation of our study of Extended Regular Expressions (ERE) from the viewpoint of modified subset construction with overridden operators such as intersection, subtraction, and re-written complement. As stated before, in this case the complexity has a decreasing tendency. We give a strict definition of the operational part of this modified subset construction, which is due to Rabin and Scott. The complexity of the algorithm remains a magnitude less than that of NP-hard problems, for which we gave a strict proof of equivalence in prior work; this work therefore continues the study of a comparable proof for a variety of computationally complex problems, explainable in terms of a unified approach like operational calculus. In this calculus, the general focus of the research is the representation of modified subset construction with at least two operands, which are computed by subset construction and, in terms of the complexity of the effective algorithm, are computed using modified subset construction.
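For context, the standard product construction behind an intersection operator on automata — a generic illustration of pairing operand states, not the paper's modified subset construction:

```python
# Product DFA for L1 ∩ L2: states are pairs of the operands' states.
def intersect_dfas(d1, d2):
    """Each DFA: (start, accepting_set, delta dict (state, ch) -> state)."""
    (s1, acc1, t1), (s2, acc2, t2) = d1, d2
    start = (s1, s2)
    states, trans, acc = {start}, {}, set()
    frontier = [start]
    while frontier:
        (a, b) = frontier.pop()
        if a in acc1 and b in acc2:
            acc.add((a, b))                      # accept iff both accept
        for ch in {c for (_, c) in t1} & {c for (_, c) in t2}:
            nxt = (t1[(a, ch)], t2[(b, ch)])
            trans[((a, b), ch)] = nxt
            if nxt not in states:
                states.add(nxt); frontier.append(nxt)
    return start, acc, trans

# L1 = even number of a's, L2 = odd number of a's; intersection is empty.
d1 = (0, {0}, {(0, "a"): 1, (1, "a"): 0})
d2 = (0, {1}, {(0, "a"): 1, (1, "a"): 0})
_, acc, _ = intersect_dfas(d1, d2)
print("accepting product states:", acc)   # set() -> empty intersection
```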
Category: Data Structures and Algorithms
[426] viXra:2307.0061 [pdf] submitted on 2023-07-12 17:31:17
Authors: Mirzakhmet Syzdykov
Comments: 1 Page.
We present a polynomial-time algorithm for producing the final graph in which the minimal Hamiltonian cycle exists, thus proving that P = NP.
Category: Data Structures and Algorithms
[425] viXra:2307.0031 [pdf] submitted on 2023-07-07 03:24:16
Authors: Sunny Daniels
Comments: 5 Pages. "Complexity Theory" would almost certainly be the best arXiv category for this paper, but there doesn't seem to be any corresponding viXra category.
I appreciate the acknowledgement by Fortnow[1] of my, historically significant, I believe, first ever constructive proof of Kannan’s theorem[2], which I think was acknowledged as "folklore" (I thank Watanabe for this) in Watanabe[3] (I don’t think I currently have access to any university library system so I don’t think I currently have access to the full text of this). If I am not mistaken, I now have a much simpler construction (I thought of this years ago if I remember rightly but was a bit too busy with an IT job unrelated to complexity theory until recently to have the time to publish it) giving a constructive proof of Kannan’s theorem, based upon a particular type of universal Turing machine. The fact that this construction (if I am not mistaken) works is an easy consequence of either my earlier constructive proof[2] or Watanabe’s later proof of the same result as discussed by Fortnow[1]. (I know that my presentations here of this construction and proof are quite sketchy at present; I would appreciate some feedback from others before attempting to write up more detailed versions of them). Also, I think it can be used as the basis of a much simpler constructive proof of Kannan’s theorem: I also discuss this here.
Category: Data Structures and Algorithms
[424] viXra:2306.0123 [pdf] submitted on 2023-06-21 22:44:29
Authors: Mesut Kavak
Comments: 4 Pages. in Turkish
I have been working for some time on the fundamental laws that govern the universe. Apparently, the most fundamental and impressive principle behind any physical phenomenon is Heisenberg's Uncertainty Principle: any property of being exists due to uncertainty. While thinking about the preservation of information in this context, I realized that information can never be lost, but at some point it becomes completely unrecognizable to us because there is no alternative to compare it against; all kinds of information, including the information being sought, become the same to us after a point. Sensitivity increases forever but does not disappear, and each sensitivity level has a higher level, so absolute preservation in fact seems possible. But is such preservation also possible outside a non-hardware quantum system, for example by capturing a quantum effect that unpredictably aims at information preservation through quantum tunneling?
Category: Data Structures and Algorithms
[423] viXra:2306.0119 [pdf] submitted on 2023-06-20 16:26:07
Authors: Ed Andersen
Comments: 2 Pages.
This document intends to emphasize some aspects of a recent algorithm capable of generating a secret key by transmitting information over a public channel. Given that the scheme’s construction is engaging and represents a topical innovation, we deem it useful to refer to it as "The Fabbrini Problem", after its author.
Category: Data Structures and Algorithms
[422] viXra:2305.0088 [pdf] submitted on 2023-05-12 00:38:46
Authors: Farid Soroush
Comments: 4 Pages.
The purpose of this report is to describe the design and implementation of a real-time portfolio risk management system. The system is developed in Python and utilizes pandas and numpy libraries for data management and calculations. With the advent of high-frequency trading, risk management has become a crucial aspect of the trading process. Traditional risk management practices are often not suitable due to the high-speed nature of these trades. Therefore, there is a need for a real-time risk management system that can keep pace with high-frequency trades.
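A minimal sketch of one such real-time risk metric — rolling historical Value-at-Risk with pandas; the window, confidence level and simulated returns are illustrative assumptions of this sketch:

```python
# Rolling historical 1-day VaR of portfolio returns.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
returns = pd.Series(rng.normal(0, 0.01, 1000))        # simulated daily returns

# 99% historical VaR over a 250-day rolling window: the loss threshold that
# returns fell below only 1% of the time within the window.
var_99 = -returns.rolling(window=250).quantile(0.01)
print(var_99.dropna().tail())
```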
Category: Data Structures and Algorithms
[421] viXra:2305.0087 [pdf] submitted on 2023-05-10 06:04:59
Authors: Herman Schoenfeld
Comments: 22 Pages.
This paper presents a formal construction of dynamic merkle-trees and a deep dive into their mathematical structure. In doing so, new and interesting artefacts are presented, as well as novel security proof constructions that enable proofs for a full range of tree transformations including append, update and deletion of leaf nodes (without requiring knowledge of those nodes). Novel concepts are explored, including "perfect trees", "sub-trees", "sub-roots" and "flat coordinates", through various lemmas, theorems and algorithms. In particular, a "flat-tree" implementation of merkle-trees is presented, suitable for storing trees as a contiguous block of memory that can grow and shrink from the right side. Of note, a "long-tree" implementation is presented which permits arbitrarily large tree construction in logarithmic space and time complexity using a novel algorithm. Finally, a reference implementation accompanies this paper, containing a fully implemented and thoroughly tested dynamic merkle-tree.
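A sketch of the log-space idea behind such a "long-tree": keep one sub-root per perfect subtree and carry like binary addition on append (an illustration of the concept, not the paper's reference implementation):

```python
# Append-only merkle builder storing O(log n) hashes for n leaves.
import hashlib

def h(x: bytes) -> bytes:
    return hashlib.sha256(x).digest()

class LongTree:
    def __init__(self):
        self.subroots = []            # subroots[i] = root of 2^i leaves, or None

    def append(self, leaf: bytes):
        node, i = h(leaf), 0
        while i < len(self.subroots) and self.subroots[i] is not None:
            node = h(self.subroots[i] + node)   # merge equal-size subtrees
            self.subroots[i] = None
            i += 1
        if i == len(self.subroots):
            self.subroots.append(node)
        else:
            self.subroots[i] = node

    def root(self) -> bytes:
        # Bag the remaining sub-roots into a single root, smallest first.
        peaks = [s for s in self.subroots if s is not None]
        node = peaks[0]
        for p in peaks[1:]:
            node = h(p + node)
        return node

t = LongTree()
for i in range(5):
    t.append(f"leaf-{i}".encode())
print(t.root().hex()[:16], "| stored hashes:",
      sum(s is not None for s in t.subroots))   # 5 leaves -> 2 stored hashes
```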
Category: Data Structures and Algorithms
[420] viXra:2305.0062 [pdf] submitted on 2023-05-07 22:23:14
Authors: Farid Soroush
Comments: 5 Pages.
Market making is a crucial component of financial markets, providing liquidity to market participants by quoting both buy (bid) and sell (ask) prices for an asset. The main objective of a market maker is to profit from the bid-ask spread while managing inventory risk. In this paper, we implement a simple market making strategy for the S&P 500 index using synthetic bid-ask spread data.
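A toy version of such a strategy — quoting around the mid-price with a fixed half-spread and skewing against inventory — with synthetic prices and a crude fill model standing in for the paper's data:

```python
# Inventory-skewed market making on a synthetic mid-price path.
import numpy as np

rng = np.random.default_rng(42)
mid = 4000 + np.cumsum(rng.normal(0, 1.0, 500))   # synthetic index mid-price

half_spread, skew_per_unit = 0.5, 0.05
inventory, cash = 0, 0.0
for px in mid:
    skew = -inventory * skew_per_unit             # lean quotes against inventory
    bid, ask = px - half_spread + skew, px + half_spread + skew
    # Crude fill model: each side fills independently with 50% probability.
    if rng.random() < 0.5: inventory += 1; cash -= bid
    if rng.random() < 0.5: inventory -= 1; cash += ask
pnl = cash + inventory * mid[-1]                  # mark remaining inventory
print(f"final inventory {inventory}, P&L {pnl:.2f}")
```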
Category: Data Structures and Algorithms
[419] viXra:2305.0033 [pdf] submitted on 2023-05-04 17:01:34
Authors: Petar Radanliev
Comments: 25 Pages.
The first cryptocurrency was invented in 2008/09, but the Blockchain-Web3 concept is still in its infancy, and the cyber risk is constantly changing. Cybersecurity should also be adapting to these changes to ensure security of personal data and continuation of operations. This article starts with a comparison of existing cybersecurity standards and regulations from the National Institute of Standards and Technology (NIST) and the International Organisation for Standardisation (ISO)—ISO27001, followed by a discussion of more specific and recent standards and regulations, such as the Markets in Crypto-Assets Regulation (MiCA), the Committee on Payments and Market Infrastructures and the International Organisation of Securities Commissions (CPMI-IOSCO), and more general cryptography (and post-quantum cryptography), in the context of cybersecurity. These topics are followed by a review of recent technical reports on cyber risk/security and a discussion of cloud security questions. A comparison of Blockchain cyber risk is also performed against the recent EU standards on cyber security, including the European Cybersecurity Certification Scheme (EUCS)—cloud, and US standards—the National Vulnerability Database (NVD) Common Vulnerability Scoring System (CVSS). The study includes a review of Blockchain endpoint security and new technologies, e.g., IoT. The research methodology applied is a review and case study analysing secondary data on cybersecurity. The research significance is the integration of knowledge from the United States (US), the European Union (EU), the United Kingdom (UK), and international standards and frameworks on cybersecurity that can be aligned to new Blockchain projects. The results show that cybersecurity standards are not designed in close cooperation between the two major western blocks: the US and the EU. In addition, while the US is still leading in this area, the security standards for cryptocurrencies, internet-of-things, and blockchain technologies have not evolved as fast as the technologies have. The key finding from this study is that although the crypto-market has grown into a multi-trillion industry, it has also lost over 70% since its peak, causing significant financial loss for individuals and corporations. Despite this significant impact on individuals and society, cybersecurity standards and financial governance regulations are still in their infancy, specifically in the UK.
Category: Data Structures and Algorithms
[418] viXra:2304.0164 [pdf] submitted on 2023-04-20 09:23:37
Authors: Jouni Puuronen
Comments: 12 Pages.
We find a container algorithm, based on performing small translations of arrays, that supports insert, search and delete operations with O(log n) computational complexity.
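For comparison, the obvious sorted-array baseline: O(log n) search but O(n) insert and delete, the gap the proposed translation-based structure aims to close (this is the baseline, not the paper's algorithm):

```python
# Plain sorted-array container via Python's bisect module.
import bisect

class SortedArray:
    def __init__(self):
        self.a = []

    def insert(self, x):
        bisect.insort(self.a, x)            # O(log n) find + O(n) shift

    def search(self, x):
        i = bisect.bisect_left(self.a, x)   # O(log n)
        return i < len(self.a) and self.a[i] == x

    def delete(self, x):
        i = bisect.bisect_left(self.a, x)
        if i < len(self.a) and self.a[i] == x:
            self.a.pop(i)                   # O(n) shift

c = SortedArray()
for v in [5, 1, 9, 3]:
    c.insert(v)
c.delete(9)
print(c.a, c.search(3), c.search(9))        # [1, 3, 5] True False
```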
Category: Data Structures and Algorithms
[417] viXra:2303.0042 [pdf] submitted on 2023-03-06 20:36:23
Authors: Subrata Pandey
Comments: 7 Pages.
This research paper presents a comparative study of the performance of two optimization algorithms, namely the Firefly Algorithm (FFA) and Fminsearch, for the PID tuning of a DC motor speed control system. The performance of the two algorithms is evaluated based on the time-domain response of the DC motor speed control system. The Integral of Time multiplied by the Absolute value of the Error (ITAE) is used as the objective function here. The results show that the Fminsearch algorithm outperforms the FFA in terms of the rate of convergence, computation time and settling time. The proposed approach provides an effective and efficient way for the design and tuning of PID controllers in DC motor speed control systems. This study contributes to the ongoing efforts in developing more efficient and reliable control systems for DC motors, which are widely used in various industrial applications.
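For concreteness, the objective both optimizers minimize — ITAE, the integral of t·|e(t)| dt — computed here from a mock sampled step response (the response curve is an illustrative assumption):

```python
# ITAE from a sampled closed-loop step response.
import numpy as np

t = np.linspace(0, 5, 5001)                  # seconds
setpoint = 1.0
speed = 1 - np.exp(-2 * t) * np.cos(6 * t)   # mock motor speed response
error = setpoint - speed

itae = np.sum(t * np.abs(error)) * (t[1] - t[0])   # rectangle-rule integral
print(f"ITAE = {itae:.4f}")
```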
Category: Data Structures and Algorithms
[416] viXra:2303.0001 [pdf] submitted on 2023-03-01 02:05:03
Authors: Jose Manuel Bustarviejo Casado, Verónica Fernández Mármol, Pablo Arteaga-Díaz
Comments: 73 Pages.
This master thesis concerns the error processing stage of a practical quantum communication system. The system allows secure communication between two parties (Alice and Bob). Initially, Alice has a string of bits that she wants to send to Bob in a safe way. To do this, Alice encodes the string of bits using photons that are polarized in horizontal, vertical, diagonal-left and diagonal-right bases. When all photons have been polarized, Alice sends them through a quantum channel to Bob. Once Bob receives them, he applies the same types of bases that Alice used for encoding, but chosen at random, for decoding. They then compare the bases they used over a public but authenticated channel, and keep only those bits for which both used the same basis. This string of bits, known as the sifted key, can still contain errors (bits that differ between Alice and Bob), produced by the noise of the channel or because Bob applied a wrong basis to one or more photons. To resolve this, Alice and Bob use a reconciliation protocol that fixes the errors in Bob's copy of the sifted key. In this reconciliation protocol, Bob divides his key into blocks. For each block, Bob calculates its parity and asks Alice for the parity of the corresponding block of her key. When Bob receives Alice's parity, he compares it with his own for the current block; if the parities are not equal, there is an error in the block, and Bob will look for the error in the block and fix it. However, there can be an eavesdropper (Eve) in the communications who tries to steal information. In this case, during quantum communication, Eve can intercept the photons that Alice has sent to Bob and do one of the following: (1) decode the key that Alice has sent to Bob using the same kinds of bases as Alice, with a 1/4 probability of success and a 3/4 probability of failure; if Eve chooses this option, she will modify the state of each photon; or (2) store the photons' states without measuring anything and wait for the photon transmissions to finish. (Truncated by viXra Admin)
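A minimal sketch of the parity-exchange step described — locating a single error in a block by recursive parity comparison over the public channel (the BINARY step of Cascade-style reconciliation); the keys below are toy bit lists:

```python
# Locate and fix one error in a block whose parities differ.
def parity(bits, lo, hi):
    return sum(bits[lo:hi]) % 2

def locate_error(alice, bob, lo, hi):
    """Find the index of one error in bob[lo:hi], given the parities differ."""
    while hi - lo > 1:
        mid = (lo + hi) // 2
        # Bob asks Alice for the left half's parity over the public channel.
        if parity(alice, lo, mid) != parity(bob, lo, mid):
            hi = mid                        # error is in the left half
        else:
            lo = mid                        # otherwise it is in the right half
    return lo

alice = [1, 0, 1, 1, 0, 0, 1, 0]
bob   = [1, 0, 1, 0, 0, 0, 1, 0]            # one flipped bit at index 3
if parity(alice, 0, 8) != parity(bob, 0, 8):
    i = locate_error(alice, bob, 0, 8)
    bob[i] ^= 1                              # fix the located error
print(bob == alice)                          # True
```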
Category: Data Structures and Algorithms
[415] viXra:2302.0099 [pdf] submitted on 2023-02-20 18:45:21
Authors: Shulang Lei
Comments: 148 Pages.
General 3D printers use Fused Deposition Modeling (FDM) and Stereolithography (SLA) technologies to print 3D models. However, turning the nozzle on and off during FDM or SLA extrusion causes unwanted results. This project created an experimental 3D model slicer named Embodier that generates continuous extruding paths whenever possible, enabling 3D printers to draw the printing layers accurately and stably. Furthermore, the slicer partitions the outlines into tree structures for efficiency and applies a flooding algorithm for water-tightness validation. Lastly, a 3D printing simulator was also created to visualize the printed paths in 3D for a more intuitive review of the Embodier slicer. The end result is the discovery that not only is a single continuous-extruded-path slicer possible, but it can also be optimized for performance and robustness in practice.
Category: Data Structures and Algorithms
[414] viXra:2212.0219 [pdf] submitted on 2022-12-30 08:49:45
Authors: Evgeny Kuznetsov
Comments: 10 Pages. unicode symbol ≠ could be changed to != if needed
Things become different at infinity. A school example: if you count the even numbers up to 100, there are half as many of them as there are numbers in total. At infinity, however, every natural number corresponds to an even number, and it turns out that there is an equal quantity of each. Without limiting the quantity of numbers, it is impossible to mathematically prove that there are fewer even numbers than numbers in total. A similar story is observed in complexity theory. In this paper, using a lazy Turing machine, it is proved that for programs with infinite length the maximum complexity is O(n). One consequence of this fact is that P = NP at infinity, and a further consequence is that it is impossible to prove that P ≠ NP without limiting the program's length. In the time hierarchy theorem, diagonalization implicitly limits the program's length. We need a similar trick to keep progress going.
Category: Data Structures and Algorithms
[413] viXra:2212.0184 [pdf] submitted on 2022-12-26 01:27:36
Authors: Alexey Podorov
Comments: 7 Pages. The structure, grammar and examples of a programming language for teaching are offered
The structure, grammar and examples of a programming language for teaching are offered.
Category: Data Structures and Algorithms
[412] viXra:2212.0144 [pdf] submitted on 2022-12-19 02:17:28
Authors: Gilbert Krougman, Pasoul Rozhier, Kawabata Yakurami
Comments: 6 Pages.
Consider non-cooperative pen games where both players act strategically and heavily influence each other. In spam and malware detection, players exploit randomization to obfuscate malicious data and increase their chances of evading detection at test time. The result shows Pen-PL Games have a probability distribution that approximates a Gaussian distribution according to some probability distribution defined over the respective strategy set. With quadratic cost functions and multivariate Gaussian processes, evolving according to first order auto-regressive models, we show that Pen-PL "smooth" curve signaling rules are optimal. Finally, we show that computing a socially optimal Pen-PL network placement is NP-hard and that this result holds for all P-PL-G distributions.
Category: Data Structures and Algorithms
[411] viXra:2211.0150 [pdf] submitted on 2022-11-25 23:48:50
Authors: Stephane H. Maes
Comments: 8 Pages.
Over the last year, several vendors and evangelists have taken to promoting, and pushing, super clouds, also known as meta clouds, as the future of cloud computing, or at least as one of its latest trends. It is not clear that this is an actual trend yet, rather than just the pet project of a few. Part of the problems that we see with super clouds may come from shifting all-encompassing definitions, which can mean whatever you want them to mean. In any case, we argue that such a super cloud, as an abstract layer across clouds, may be a problematic approach from a business viability, architecture, and efficiency point of view. Using management tools and practices for multi cloud and hybrid cloud, and developing cloud native applications, is a better approach, with more manageable challenges.
Category: Data Structures and Algorithms
[410] viXra:2211.0097 [pdf] submitted on 2022-11-17 03:22:16
Authors: Sing Kuang Tan
Comments: 12 Pages.
In this paper, I propose a new programming language in mathematical algebraic form. It represents all algorithms in a canonical form that is easy to read, analyze, and communicate to other people. Loop invariants, preconditions, and postconditions are difficult to use. The algebraic form can be used to derive all the properties of an algorithm. It can be fed into a computer to be analyzed and manipulated symbolically. Analyzing an algorithm sequentially cannot reveal the global patterns in the algorithm, and is not a long-term solution.
Category: Data Structures and Algorithms
[409] viXra:2211.0085 [pdf] submitted on 2022-11-14 03:13:23
Authors: Alex-Pauline Poudade
Comments: 7 Pages.
This paper discusses a new way to encrypt data using transcendental properties discovered through the mathematical Bailey-Borwein-Plouffe (BBP) formula. With regard to pseudo-stochastic computer methods, it enables a stronger non-linear model with resistance close to that of true random number generators (TRNG), without the need for prior physical transmission of initial stochastic patterns. Soleau envelope European deposit number: DSO2017001085 - deposit reference 260819711812005332017, National Institute of Industrial Property (INPI), February 2, 2017. Supplemental Code/Data: doi.org/10.7910/DVN/CCJMAP
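For reference, the public BBP series for pi is easy to evaluate; the sketch below sums its first terms in exact rational arithmetic (this is only the formula itself, not the paper's encryption scheme):

from fractions import Fraction

def pi_bbp(terms=12):
    # Partial sum of the Bailey-Borwein-Plouffe series for pi
    s = Fraction(0)
    for k in range(terms):
        s += (Fraction(4, 8 * k + 1) - Fraction(2, 8 * k + 4)
              - Fraction(1, 8 * k + 5) - Fraction(1, 8 * k + 6)) / 16 ** k
    return float(s)

print(pi_bbp())  # 3.141592653589793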
Category: Data Structures and Algorithms
[408] viXra:2211.0072 [pdf] submitted on 2022-11-11 10:17:34
Authors: Mirzakhmet Syzdykov
Comments: 1 Page.
In this work experimental results along with a proof are presented: the state explosion does not occur in specific cases after decomposition of a regular expression into non-deterministic finite automata (NFA); thus, the P-complete procedure for converting an NFA into a deterministic finite automaton (DFA) can take its turn, with respect to De Morgan's law.
Category: Data Structures and Algorithms
[407] viXra:2211.0058 [pdf] submitted on 2022-11-11 01:53:46
Authors: Mirzakhmet Syzdykov
Comments: 4 Pages.
We present the difference of the "P versus NP" problem for polynomial and non-polynomial classes on the example of two contest problems held by ACM ICPC NEERC in 2004 and 2005.
Category: Data Structures and Algorithms
[406] viXra:2209.0128 [pdf] submitted on 2022-09-22 05:21:39
Authors: Kola Sampangi Sambaiah
Comments: 38 Pages. I declare the research article nowhere submitted or published completely or partially.
Capacitor allocation plays a vital role in the planning and operation of distribution networks. Optimal allocation of capacitors provides reactive power compensation, relieves feeder capacity, improves the voltage profile, minimizes power losses and annual cost, and maximizes net savings. However, optimal capacitor allocation is a complex combinatorial optimization problem which consists of finding the bus locations, the number of capacitors to be installed, and their respective sizes while satisfying all the distribution network constraints. The present article investigates the effective implementation of two novel metaheuristic algorithms for solving capacitor allocation optimization problems in the radial distribution network (RDN). The first algorithm, inspired by the water cycle process of nature, in which streams and rivers flow to the sea, is known as the water cycle algorithm (WCA). The second algorithm, inspired by the swarming behavior of salps navigating and foraging in oceans, is known as the salp swarm algorithm (SSA). To verify their feasibility, WCA and SSA are tested on standard 9-, 33-, 34-, 69-, and 85-bus RDNs. Both algorithms require little computational time for evaluating the objective function, and only a few parameters need to be tuned. In addition, to show the superiority of the results obtained by WCA and SSA, a comparison has been made with various existing optimization techniques. The comparison confirms that both algorithms are more effective in minimizing power losses and operating costs and are well suited for solving capacitor allocation problems in RDNs.
Category: Data Structures and Algorithms
[405] viXra:2208.0034 [pdf] submitted on 2022-08-07 22:37:53
Authors: Mirzakhmet Syzdykov
Comments: 3 Pages.
We prove the nonequivalence of the P and NP classes via comparison of the speed of growth of polynomial and non-polynomial functions and the approximated limit for both of them.
Category: Data Structures and Algorithms
[404] viXra:2207.0150 [pdf] submitted on 2022-07-25 06:15:31
Authors: Sanjeev Saxena
Comments: 6 Pages.
This note describes a very simple O(1) query time algorithm for finding level ancestors. This is basically a serial (re)-implementation of the parallel algorithm. Earlier, Menghani and Matani described another simple algorithm; however, their algorithm takes O(log n) time to answer queries. Although the basic algorithm has preprocessing time of O(n log n), by having additional levels, the preprocessing time can be reduced to almost linear or linear.
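For context, a standard baseline (not the paper's O(1) scheme) is binary lifting, with O(n log n) preprocessing and O(log n) per query; a Python sketch:

def build_jump_table(parent):
    # parent[v] is the parent of vertex v, or -1 for the root
    n = len(parent)
    up = [parent[:]]
    for j in range(1, max(1, n.bit_length())):
        prev = up[-1]
        up.append([-1 if prev[v] == -1 else prev[prev[v]] for v in range(n)])
    return up

def level_ancestor(up, v, k):
    # Ancestor k edges above v, or -1 if the walk leaves the tree
    j = 0
    while k and v != -1 and j < len(up):
        if k & 1:
            v = up[j][v]
        k >>= 1
        j += 1
    return -1 if k else v

up = build_jump_table([-1, 0, 1, 2])  # the path 0-1-2-3
print(level_ancestor(up, 3, 2))  # 1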
Category: Data Structures and Algorithms
[403] viXra:2207.0143 [pdf] submitted on 2022-07-25 00:32:44
Authors: Injung Kim, Yujin Lee, Seungjun Park
Comments: 6 Pages. In Korean. CC BY.
Ooho is a kind of water bottle that surrounds and stores water with a membrane of calcium alginate, a biodegradable material. Ooho was expected to replace plastic water bottles, to be commercialized, and to have a positive impact on the environment. However, it can hardly be said to have been popularized, as it has only been used at some events; some fatal shortcomings have hindered its commercialization. Therefore, in this study, the reasons why Ooho is difficult to use as an everyday water bottle are identified specifically and clearly through various experiments. In addition, complementary methods are derived by analyzing the experimental results with machine learning techniques. Finally, based on the above results, we propose a scientific complementary method that could lead to the commercialization of Ooho.
Category: Data Structures and Algorithms
[402] viXra:2207.0049 [pdf] submitted on 2022-07-06 19:05:22
Authors: Stephane H. Maes
Comments: 19 Pages.
As businesses and business trends continue to evolve, so do the tools of each line of business within the organization. In this paper, we explore the current evolution of ITIL practices, ITSM, ITOM, and ESM algorithms and tools, along with DevOps, ERP, and other enterprise and industry applications, as well as how these changes are bringing organizations closer to digital transformation.
The paper provides architecture, lessons learned, and recommendations on how best to combine these algorithms, tools and practices while keeping the “separation of concerns” at the forefront.
The paper discusses how digital transformation and ITIL will take organizations on a path of business maturity with omnichannel self-service and automation, as well as how they can help grow the business faster while maintaining employee retention.
Category: Data Structures and Algorithms
[401] viXra:2207.0022 [pdf] submitted on 2022-07-04 22:44:50
Authors: Yuly Shipilevsky
Comments: 6 Pages.
We reduce finding the Least Common Multiple of integer numbers to polynomial-time integer optimization problems and to NP-hard integer optimization problems.
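The polynomial-time side is unsurprising, since the LCM itself is classically computable in polynomial time via Euclid's algorithm:

from math import gcd

def lcm(a, b):
    # lcm(a, b) = |a * b| / gcd(a, b)
    return abs(a * b) // gcd(a, b)

print(lcm(12, 18))  # 36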
Category: Data Structures and Algorithms
[400] viXra:2206.0100 [pdf] submitted on 2022-06-19 21:59:09
Authors: Jabari Zakiya
Comments: 37 Pages.
This paper explains in detail the math and software comprising the implementation of a fast and efficient Segmented Sieve of Zakiya (SSOZ) to count the number of Twin and Cousin Primes within a 64-bit interval, and to provide the largest value. Implementations in six programming languages are provided, with benchmarks run on 8- and 16-thread systems. The paper provides the details to code it in any language of choice, using the given coded versions as reference implementations.
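A naive baseline against which such counts can be checked (an ordinary sieve, not the segmented, threaded SSOZ of the paper):

def twin_primes(n):
    # Sieve of Eratosthenes, then a scan for twin pairs (p, p + 2)
    sieve = bytearray([1]) * (n + 1)
    sieve[0] = sieve[1] = 0
    for p in range(2, int(n ** 0.5) + 1):
        if sieve[p]:
            sieve[p * p::p] = bytearray(len(range(p * p, n + 1, p)))
    pairs = [(p, p + 2) for p in range(3, n - 1) if sieve[p] and sieve[p + 2]]
    return len(pairs), pairs[-1]

print(twin_primes(1_000_000))  # count of twin pairs below 10**6 and the largest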
Category: Data Structures and Algorithms
[399] viXra:2206.0062 [pdf] submitted on 2022-06-13 21:34:22
Authors: Yuly Shipilevsky
Comments: 4 Pages.
We reduce finding the Least Common Multiple of two integer numbers to a polynomial-time integer optimization problem and to an NP-hard integer optimization problem, which would imply P = NP.
Category: Data Structures and Algorithms
[398] viXra:2205.0039 [pdf] submitted on 2022-05-07 12:57:23
Authors: Arash Vaezi, Sara Azarnoush, Parsa Mohammadian
Comments: 37 Pages.
The objective of any security system is the capacity to keep a secret. It is vital to keep data secret both when it is stored and when it is sent over a network. Nowadays, many people utilize the internet to access various resources, and several businesses employ a dispersed environment to give services to their users. As a result, a more secure distributed environment is required, in which all transactions and processes can be completed safely and effectively. It is critical in a distributed system environment to deliver reliable services to users at any time and from any place. Blockchain, as an example, is a unique distributed system that has confronted lots of attacks despite its security mechanism. Security is a top priority in a distributed setting. This paper organizes the many attacks that Byzantine users may apply to take advantage of the loyal users of a system. A wide range of previous articles considered diverse types of attacks; however, we could not find a well-organized document that helps scientists consider different attacking aspects while designing a new distributed system. A hundred of the most essential kinds of attacks are categorized and summarized in this article.
Category: Data Structures and Algorithms
[397] viXra:2205.0004 [pdf] submitted on 2022-05-01 21:39:39
Authors: A. V. Serghienko
Comments: 38 Pages.
The literature on Delphi numbers many manuals. However, among them there are few books oriented to the solution of scientific and technical problems. The work contains examples of programs in Delphi. Depending on the type of problem, it is convenient to use either graphic (usual) or console applications of Delphi. Graphic applications are applicable for the plotting of functions. Console applications are applicable especially when we do not need visualization and it is necessary to enter a lot of data by hand. One can tell the console applications from the graphic ones by the presence of the following line in the text of the program:
{$APPTYPE CONSOLE}
Category: Data Structures and Algorithms
[396] viXra:2204.0168 [pdf] submitted on 2022-04-29 20:25:28
Authors: Amey Thakur, Mega Satish, Randeep Kaur Kahlon, Hasan Rizvi, Ajay Davare
Comments: 8 Pages. 16 figures, Volume 11, Issue 04, INTERNATIONAL JOURNAL OF ENGINEERING RESEARCH & TECHNOLOGY (IJERT), April 2022.
We propose to develop a program that can show a QuadTree view and data model architecture. Nowadays, many digital map applications have the need to present large quantities of precise point data on the map. Such data can be weather information or the population in towns. With the development of the Internet of Things (IoT), we expect such data will grow at a rapid pace. However, visualizing and searching in such a magnitude of data becomes a problem as it takes a huge amount of time. QuadTrees are data structures that are used to efficiently store point data in a two-dimensional environment. Each node in this tree has a maximum of four children. QuadTrees allow us to visualize the data easily and rapidly compared to other data structures. This project aims to build an application for interactively visualizing such data, using a combination of grid-based clustering and hierarchical clustering, along with QuadTree spatial indexing. This application illustrates the simulation of the working of the QuadTree data structure.
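A minimal point-quadtree sketch in Python (the class name, capacity threshold, and split policy are illustrative, not the project's code):

class QuadTree:
    # Each node holds up to cap points, then splits its square into four
    def __init__(self, x, y, size, cap=4):
        self.x, self.y, self.size, self.cap = x, y, size, cap
        self.points, self.children = [], None

    def insert(self, px, py):
        if not (self.x <= px < self.x + self.size and
                self.y <= py < self.y + self.size):
            return False  # point lies outside this node's square
        if self.children is None:
            if len(self.points) < self.cap:
                self.points.append((px, py))
                return True
            self._split()
        return any(c.insert(px, py) for c in self.children)

    def _split(self):
        h = self.size / 2
        self.children = [QuadTree(self.x, self.y, h, self.cap),
                         QuadTree(self.x + h, self.y, h, self.cap),
                         QuadTree(self.x, self.y + h, h, self.cap),
                         QuadTree(self.x + h, self.y + h, h, self.cap)]
        for p in self.points:  # push stored points down into the children
            any(c.insert(*p) for c in self.children)
        self.points = []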
Category: Data Structures and Algorithms
[395] viXra:2204.0040 [pdf] submitted on 2022-04-09 20:53:31
Authors: Qui Somnium
Comments: 58 Pages.
We argue that the current Proof of Work based consensus algorithm of the Bitcoin network suffers from a fundamental economic discrepancy between the real-world transaction costs incurred by miners and the wealth that is being transacted. Put simply, whether one transacts 1 satoshi or 1 bitcoin, the same amount of electricity is needed when including this transaction in a block. The notorious Bitcoin blockchain problems, such as its high energy usage per transaction or its scalability issues, are, either partially or fully, mere consequences of this fundamental economic inconsistency. We propose making the computational cost of securing the transactions proportional to the wealth being transferred, at least temporarily. First, we present a simple incentive based model of Bitcoin's security. Then, guided by this model, we augment each transaction by two parameters, one controlling the time spent securing this transaction and the second determining the fraction of the network used to accomplish this. The current Bitcoin transactions are naturally embedded into this parametrized space. Then we introduce a sequence of hierarchical block structures (HBSs) containing these parametrized transactions. The first of those HBSs exploits only a single degree of freedom of the extended transaction, namely the time investment, but it already allows for transactions with a variable level of trust together with aligned network fees and energy usage. In principle, the last HBS should scale to tens of thousands of timely transactions per second while preserving what the previous HBSs achieved. We also propose a simple homotopy based transition mechanism which enables us to relatively safely and continuously introduce new HBSs into the existing blockchain. Our approach is constructive and as rigorous as possible, and we attempt to analyze all aspects of these developments, at least at a conceptual level. The process is supported by evaluation on recent transaction data.
Category: Data Structures and Algorithms
[394] viXra:2202.0183 [pdf] submitted on 2022-02-28 22:17:38
Authors: Vishal Pandey, Dhiraj Ojha
Comments: 3 Pages.
In graph theory there are the concepts of matching and edge covering, and we want to find the relation between the two. There is, however, no restriction on which variants to take: maximum matching or minimum edge cover, maximum edge cover or minimum matching, or any other, such as perfect matching. We take a simple and adjustable pair of set functions (maximum and minimum), which is helpful in the counterpart of the calculations. The assumption we make is a maximum matching and a minimal edge cover in an arbitrary graph. After carrying out the calculations for this situation, we find that the sum of the maximum matching and the minimum edge cover is less than or equal to the number of vertices.
Category: Data Structures and Algorithms
[393] viXra:2202.0017 [pdf] submitted on 2022-02-03 20:07:38
Authors: Amey Thakur, Mega Satish
Comments: 6 pages, 5 figures, Volume 10, Issue I, International Journal for Research in Applied Science and Engineering Technology (IJRASET), 2022. DOI: https://doi.org/10.22214/ijraset.2022.40066
The purpose of this paper is to introduce the Julia programming language with a concentration on Text Summarization. An extractive summarization algorithm is used for summarizing. Julia's evolution and features, as well as comparisons to other programming languages, are briefly discussed. The system's operation is depicted in a flow diagram, which illustrates the processes of sentence selection.
Category: Data Structures and Algorithms
[392] viXra:2201.0155 [pdf] submitted on 2022-01-24 13:33:00
Authors: Antonio Boccuto, Ivan Gerace, Valentina Giorgetti
Comments: 39 Pages.
We deal with the image deblurring problem. We assume that the blur mask has large dimensions. To restore the images, we propose a GNC-type technique, in which a convex approximation of the energy function is first minimized. The computational cost of the GNC algorithm depends strongly on the cost of this first minimization. So, we propose approximating the Toeplitz symmetric matrices in the blur operator by means of suitable matrices, chosen from a class of matrices which can be expressed as a direct sum of a circulant and a reverse circulant matrix.
Category: Data Structures and Algorithms
[391] viXra:2201.0097 [pdf] submitted on 2022-01-16 10:28:30
Authors: Sunny Daniels
Comments: 14 Pages. "Computers and Society" would appear to be the most appropriate arXiv category for this article, but there does not seem to be a corresponding category in viXra.
I have an old computer, probably manufactured in the 1990s, with e-mails from the late Dr Michael Lennon of Auckland University (who passed away in 1999) on its hard drive. I think that these e-mails might be of significant historical interest to mathematicians (and probably others) because of the significant contribution that I believe Dr Michael Lennon made (in the 1970s, I believe) to the knot theory breakthrough in the 1980s that resulted in the New Zealand mathematician Sir Vaughan Jones (who passed away in 2020, I believe) being awarded a Fields Medal in 1990. Because of the age of this computer and its hard drive, and the possible historical value of these old e-mails from Dr Michael Lennon, I wish to try to extract these e-mails from this old computer in such a way as to minimise the probability of triggering any hardware or software failure that could endanger this old data. I propose a method involving DTMF tones and also an alternative method involving a data logger attached to the serial port of this old computer. I would appreciate feedback on these proposed methods from people knowledgeable in this area (extraction of valuable data from possibly unreliable old computers) before attempting to go any further with the extraction process.
Category: Data Structures and Algorithms
[390] viXra:2201.0086 [pdf] submitted on 2022-01-14 22:09:33
Authors: Mason White
Comments: 6 Pages.
In a world that is dominated by speed and instant gratification, it comes as no surprise that the technologies powering applications also have to be as fast as possible. Currently, the programming language Python is the ruler of the programming world, used to program everything from web applications to advanced AI systems. However, in recent years a challenger has emerged that is attempting to dethrone Python. Julia is a new programming language that is touted as a Python killer. Sporting an expressive syntax and JIT compilation using LLVM, it has the speed of C++ with the development feel of Python. Though it is often considered faster than Python, there are several areas that most studies ignore, such as code styling and general program architecture. As such, this study looks at the speeds of basic functionalities in both Python and Julia, such as for loops, printing basic math equations to the console, and if statements, to determine just how much faster Julia is. By studying only the listed commands, all bias from architecture and coding style can be mitigated or eliminated, and a true understanding of how much faster Julia is can be determined.
Category: Data Structures and Algorithms
[389] viXra:2201.0051 [pdf] submitted on 2022-01-10 12:07:02
Authors: Mason White
Comments: 6 Pages.
A good codebase is a well-documented codebase. Many scientific programs are being developed by scientists as opposed to formally trained developers. As such, the need for scientists to know how to properly document code is growing. A well-documented codebase allows scientists to write better, easier-to-understand code that will ultimately free up resources for more productive tasks, such as conducting their research, as opposed to spending countless hours trying to modify a codebase to meet new requirements. This document gives a brief tutorial on how to properly write code comments as well as API-level documentation that will make scientific software easier to understand and much more modifiable in the future.
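As a minimal illustration of the kind of API-level documentation advocated here (a hypothetical function, not taken from the paper):

def moving_average(values, window):
    """Return the arithmetic moving averages of values.

    Args:
        values: Sequence of numbers to smooth.
        window: Number of consecutive samples per average (>= 1).

    Returns:
        A list of len(values) - window + 1 floats.

    Raises:
        ValueError: If window is smaller than 1 or longer than values.
    """
    if not 1 <= window <= len(values):
        raise ValueError("window must be between 1 and len(values)")
    return [sum(values[i:i + window]) / window
            for i in range(len(values) - window + 1)]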
Category: Data Structures and Algorithms
[388] viXra:2201.0008 [pdf] submitted on 2022-01-02 20:23:07
Authors: A. V. Antipin
Comments: 9 Pages. This article is in RUSSIAN.
To determine the period and phase of oscillations present in experimental data, an original computer algorithm has been developed, using full iteration over models obtained by the Least Squares Method (LSM). A distinctive feature of the algorithm is the ability to use source data with omissions, with samples arbitrarily arranged along the X axis, as well as the absence of any need for preliminary data preparation, owing to the possibility of specifying arbitrary models for the LSM.
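A minimal Python sketch of the idea (synthetic data, not the paper's code): fit a sinusoid at each trial period by linear least squares and keep the period with the smallest residual, which naturally tolerates gaps and uneven sampling:

import numpy as np

def lsm_fit(t, y, period):
    # Fit y ~ a*sin(wt) + b*cos(wt) + c; return (coefficients, squared residual)
    w = 2 * np.pi / period
    A = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, float(np.sum((A @ coef - y) ** 2))

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0, 10, 200))            # irregular samples with gaps
y = np.sin(2 * np.pi * t / 2.5 + 0.7) + 0.1 * rng.standard_normal(200)
best = min(np.linspace(1.0, 5.0, 401), key=lambda p: lsm_fit(t, y, p)[1])
(a, b, c), _ = lsm_fit(t, y, best)
print(best, np.arctan2(b, a))  # recovered period (about 2.5) and phase (about 0.7)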
Category: Data Structures and Algorithms
[387] viXra:2112.0088 [pdf] submitted on 2021-12-16 03:09:37
Authors: Guanxuan Wu
Comments: 5 Pages.
The thirty-year development of search engines has never escaped the fundamental problem: improving the relevance of page rankings for a given query. As NLP models are now widely used, this paper discusses a data abstraction via knowledge graphs (KG) for NLP models that could be applied to relating keywords to entities with higher probability.
Category: Data Structures and Algorithms
[386] viXra:2110.0002 [pdf] submitted on 2021-10-01 20:14:11
Authors: Yu-Cheng Liu
Comments: 11 Pages.
Sorting algorithms are one of the most important fields in computer science. People learn sorting algorithms before learning other advanced algorithms. Since sorting is important, computer scientists study and try to create new sorting algorithms. The paper is composed of five themed sections: Introduction, Method, Analysis, Experiment Result, and Conclusion. The first section gives a brief overview of the algorithm and introduces a new way to sort a list called PT sort, a non-comparison integer sorting algorithm that is based on subtracting the largest exponent with radix two, then using recursion and traversal on every separated list. Sections two and three lay out the theoretical dimensions of the research, propose the methodology, and analyze the time complexity of PT sort. The fourth section presents the findings of the research, focusing on the result of the experiment. The time complexity and space complexity of PT sort are approximately O(n · log2 r), where n is the number of numbers being sorted and r is the largest number in the list.
Category: Data Structures and Algorithms
[385] viXra:2109.0113 [pdf] submitted on 2021-09-11 21:00:36
Authors: Arundale Ramanathan
Comments: 17 Pages.
Unishox is a hybrid encoding technique, with which short unicode strings could be compressed using context aware pre-mapped codes and delta coding resulting in surprisingly good ratios.
Category: Data Structures and Algorithms
[384] viXra:2108.0142 [pdf] submitted on 2021-08-26 11:08:09
Authors: Amey Thakur, Mega Satish
Comments: 12 pages, 12 figures, Volume 9, Issue VII, International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2021. DOI: http://dx.doi.org/10.22214/ijraset.2021.36609
The project's main goal is to build an online book store where users can search for and buy books based on title, author, and subject. The chosen books are shown in a tabular style and the customer may buy them online using a credit card. Using this Website, the user may buy a book online rather than going to a bookshop and spending time. Many online bookstores, such as Powell's and Amazon, were created using HTML. We suggest creating a comparable website with .NET and SQL Server. An online book store is a web application that allows customers to purchase ebooks. Through a web browser the customers can search for a book by its title or author, later can add it to the shopping cart and finally purchase using a credit card transaction. The client may sign in using his login credentials, or new clients can simply open an account. Customers must submit their full name, contact details, and shipping address. The user may also provide a review of a book by rating it on a scale of one to five. The books are classified into different types depending on their subject matter, such as software, databases, English, and architecture. Customers can shop online at the Online Book Store Website using a web browser. A client may create an account, sign in, add things to his shopping basket, and buy the product using his credit card information. As opposed to a frequent user, the Administrator has more abilities. He has the ability to add, delete, and edit book details, book categories, and member information, as well as confirm a placed order. This application was created with PHP and web programming languages. The Online Book Store is built using the Master page, data sets, data grids, and user controls.
Category: Data Structures and Algorithms
[383] viXra:2108.0140 [pdf] submitted on 2021-08-25 01:05:28
Authors: Amey Thakur
Comments: 12 pages, 18 figures, Volume 9, Issue VII, International Journal for Research in Applied Science & Engineering Technology (IJRASET), 2021. DOI: http://dx.doi.org/10.22214/ijraset.2021.36339
Customers will be able to reserve their vehicles from anywhere in the world due to the Car Rental System. Consumers provide information to this application by filling in their personal information. When a consumer creates an account on the website, he or she can reserve a car. The proposed system is an online system that is fully integrated. It effectively and efficiently automates manual procedures. Customers are aided by this automated method, which allows them to fill in the specifics according to their needs. It contains information on the sort of car they want to hire as well as the location. The goal of this system is to create a website where customers can book their automobiles and request services from anywhere in the world. There are three phases to this car rental system mentioned in the introduction.
Category: Data Structures and Algorithms
[382] viXra:2108.0119 [pdf] submitted on 2021-08-23 13:15:50
Authors: Mirzakhmet Syzdykov
Comments: 5 Pages.
Membership Problem in Non-deterministic Finite Automata for Extended Regular Expressions in Linear Polynomial Time.
Category: Data Structures and Algorithms
[381] viXra:2107.0030 [pdf] submitted on 2021-07-05 20:22:23
Authors: Victor Porton
Comments: 3 Pages.
My proof of P ≠ NP ⇔ NP = EXPTIME.
Category: Data Structures and Algorithms
[380] viXra:2106.0132 [pdf] submitted on 2021-06-23 13:58:05
Authors: Boris Zavlin
Comments: 6 Pages.
We determine the lower bound for arbitrarily aligned perimeter and area Minimal Enclosing Rectangle (MER) problems to be Omega(n log n) by using a reduction technique for a problem with this known lower bound.
Category: Data Structures and Algorithms
[379] viXra:2106.0026 [pdf] submitted on 2021-06-05 19:05:58
Authors: Arjun Dahal
Comments: 3 Pages.
Abstraction, the indexing process, and identification are important topics which, if explained in full, would be lengthy, but they are essential for bibliographical methods and for the way we handle any data, for convenience in making archival records. Anything we refer to in the past tense, as per our grammar, is archived material, beginning with our thoughts as well as our works. Thus, in this paper, we attempt to discuss how we can make our archives of published materials more systematic in online as well as offline media.
Category: Data Structures and Algorithms
[378] viXra:2106.0007 [pdf] submitted on 2021-06-03 17:21:38
Authors: UnChol Ri, ChungHyok Kim, BomIl Kim, GilWun Mun, GumChol Ri
Comments: 4 Pages.
In this paper, a study on the mass conversion of file names under various conditions, which has been raised as a difficult problem in the use of computers, is described. This method can deal with file name conversion problems that cannot be solved with conventional methods, in the fastest time possible.
This method consists of three main steps.
That is, it consists of the process of obtaining the file names, the process of converting the file names as required using a file name conversion formula in Excel, and the process of finally renaming the files using the converted names according to the conversion formula. This method is much easier than the previously described conversion method and has the advantage that it can be operated without error on any computer, providing a new premise for future research on data processing in computers.
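A minimal Python sketch of the same batch-renaming idea without Excel (the folder name and the renaming rule are hypothetical):

import os

def batch_rename(folder, transform):
    # Apply transform to every file name in folder
    for name in sorted(os.listdir(folder)):
        src = os.path.join(folder, name)
        if os.path.isfile(src):
            os.rename(src, os.path.join(folder, transform(name)))

# Example rule: zero-pad names, e.g. "3_data.txt" -> "003_data.txt"
# batch_rename("incoming", lambda n: n.zfill(12))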
Category: Data Structures and Algorithms
[377] viXra:2106.0002 [pdf] submitted on 2021-06-01 13:01:35
Authors: Janko Kokosar
Comments: 9 Pages.
The article provides a proposal for a database of all 123+ pairs of highway entrances in Slovenia. This database can be composed of hyperlinks to Google Street View. This would help prevent wrong-way driving on highways. Next, an example of hyperlinks is given for 15 such pairs, located at the intersection of the A2 and E61 highways. This database can then be made even more transparent, and even more quickly accessible. This principle can be extended to all world highways. The usefulness of such a database is also argued. I hope for help from volunteers, Google, Via Michelin, DARS, and others.
Category: Data Structures and Algorithms
[376] viXra:2104.0112 [pdf] submitted on 2021-04-19 20:36:52
Authors: Domenico Oricchio
Comments: 1 Page.
An electronic label for scientific articles.
Category: Data Structures and Algorithms
[375] viXra:2103.0090 [pdf] submitted on 2021-03-13 23:48:31
Authors: Stephen P. Smith
Comments: 5 Pages.
This paper describes two optimization methods that use all first derivatives, and a subset of second derivatives, all of which are available with backward differentiation. The first method is Newton’s method on a direction set that changes dynamically during iteration. The second method is a quasi-Newton method that approximates the inverse Hessian matrix using a subset of second derivatives.
Category: Data Structures and Algorithms
[374] viXra:2103.0041 [pdf] submitted on 2021-03-07 20:27:29
Authors: Evan R.M. Debenham, Roberto Solis-Oba
Comments: 20 Pages. Published in IJCSIT, 13(1), 2021 [Corrections are made by viXra Admin to comply with the rules of viXra.org]
This paper presents new algorithms for Field of Vision (FOV) computation which improve on existing work at high resolutions. FOV refers to the set of locations that are visible from a specific position in a scene of a computer game.
We review existing algorithms for FOV computation, describe their limitations, and present new algorithms which aim to address these limitations. We first present an algorithm which makes use of spatial data structures in a way which is new for FOV calculation. We then present a novel technique which updates a previously calculated FOV, rather than re-calculating an FOV from scratch.
We compare our algorithms to existing FOV algorithms and show they provide substantial improvements to running time. Our algorithms provide the largest improvement over existing FOV algorithms at high resolutions, thus allowing the possibility of the creation of high resolution FOV-based video games.
Category: Data Structures and Algorithms
[373] viXra:2102.0160 [pdf] submitted on 2021-02-26 22:00:04
Authors: Andrew Holster
Comments: 102 Pages.
CAT4 is proposed as a general method for representing information, enabling a powerful programming method for large-scale information systems. It enables generalised machine learning, software automation and novel AI capabilities. This is Part 3 of a five-part introduction. The focus here is on explaining the semantic model for CAT4. Points in CAT4 graphs represent facts. We introduce all the formal (data) elements used in the classic semantic model: sense or intension (1st and 2nd joins), reference (3rd join), functions (4th join), time and truth (logical fields), and symbolic content (name/value fields). Concepts are introduced through examples alternating with theoretical discussion. Some concepts are assumed from Parts 1 and 2, but key ideas are re-introduced. The purpose is to explain the CAT4 interpretation, and why the data structure and CAT4 axioms have been chosen, to make the semantic model consistent and complete. We start with methods to translate information from database tables into graph DBs and into CAT4. We then present a method for translating natural language into CAT4, and conclude with a comparison of the system with an advanced semantic logic, the hyper-intensional logic TIL, which also aims to translate NL into a logical calculus. The CAT4 Natural Language Translator is discussed in further detail in Part 4, when we introduce functions more formally. Part 5 discusses software design considerations.
Category: Data Structures and Algorithms
[372] viXra:2102.0119 [pdf] submitted on 2021-02-19 21:01:49
Authors: Szostek Roman
Comments: 24 Pages. New sports playing system (algorithm) called R-Sport [Corrections made by viXra Admin to conform with the requirements on the Submission Form]
The aim of sports competitions is to select the best team, i.e. the champion, from a group of teams or players. Therefore, matches must be played between the individual teams. The results of all matches decide who becomes the champion. The rules of the season form the Sports Playing System. This document describes the new Sports Playing System. This system allows the fair and efficient selection of the winner of the entire season. It has advantages that other well-known and currently used Sports Playing Systems do not have. R-Sport can be used in classic sports as well as in e-sports. R-Sport is a Sports Playing System that allows conducting league matches in many ways.
Category: Data Structures and Algorithms
[371] viXra:2102.0020 [pdf] submitted on 2021-02-04 20:34:07
Authors: Hamidreza Seiti, Ahmad Makui, Ashkan Hafezalkotob, Mehran Khalaj, Ibrahim A. Hameed
Comments: 48 Pages.
Various unexpected, low-probability events can have short or long-term effects on organizations and the global economy. Hence there is a need for appropriate risk management practices within organizations to increase their readiness and resiliency, especially if an event may lead to a series of irreversible consequences. One of the main aspects of risk management is to analyze the levels of change and risk in critical variables which the organization's survival depends on. In these cases, an awareness of risks provides a practical plan for organizational managers to reduce/avoid them. Various risk analysis methods aim at analyzing the interactions of multiple risk factors within a specific problem. This paper develops a new method of variability and risk analysis, termed R.Graph, to examine the effects of a chain of possible risk factors on multiple variables. Additionally, different configurations of risk analysis are modeled, including acceptable risk, analysis of maximum and minimum risks, factor importance, and sensitivity analysis. This new method's effectiveness is evaluated via a practical analysis of the economic consequences of new Coronavirus in the electricity industry.
Category: Data Structures and Algorithms
[370] viXra:2101.0033 [pdf] submitted on 2021-01-05 01:30:52
Authors: Amr Abdellatif
Comments: 53 Pages.
We explored the requirements for proven features for real-time use in web browsers, adopting a linear SVM based face detection model as a test case to evaluate each descriptor with appropriate parameters. After checking multiple feature extraction algorithms, we decided to study the following four descriptors: Histogram of Oriented Gradients, Canny edge detection, Local Binary Patterns, and Dense DAISY. These four descriptors are used in various computer vision tasks and offer a wide range of options. We then investigated the influence of different parameters, as well as of dimension reduction, on each descriptor's computational time and its ability to be processed in real time. We also evaluated the influence of such changes on the accuracy of each model.
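A minimal sketch of one such descriptor-plus-SVM pipeline, using scikit-image's HOG and scikit-learn's LinearSVC (the dataset variables are placeholders and the parameters merely illustrative):

import numpy as np
from skimage.feature import hog
from sklearn.svm import LinearSVC

def hog_features(images, orientations=9, cell=(8, 8), block=(2, 2)):
    # One HOG descriptor vector per grayscale image
    return np.array([hog(img, orientations=orientations,
                         pixels_per_cell=cell, cells_per_block=block)
                     for img in images])

# X_train: grayscale face / non-face patches; y_train: 1 / 0 labels
# clf = LinearSVC(C=1.0).fit(hog_features(X_train), y_train)
# predictions = clf.predict(hog_features(X_test))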
Category: Data Structures and Algorithms
[369] viXra:2101.0032 [pdf] submitted on 2021-01-05 01:35:29
Authors: Hao Liu
Comments: 60 Pages.
Machine learning continues to be an increasingly integral component of our lives, whether we are applying the techniques to research or business problems. Machine learning models ought to be able to give accurate predictions in order to create real value for a given organization. At the same time, machine learning training by running algorithms in the browser has gradually become a trend. As the closest link to users on the Internet, the web front-end can also create a better experience for our users through AI capabilities. This article focuses on how to evaluate machine learning algorithms and deploy machine learning models in the browser. We use the "Cars", "MNIST" and "Cifar-10" datasets to test LeNet, AlexNet, GoogLeNet and ResNet models in the browser. We also test emerging lightweight models such as MobileNet. By trying, comparing, and comprehensively evaluating regression and classification tasks, we summarize some excellent methods, models, and experiences suitable for machine learning in the browser.
Category: Data Structures and Algorithms
[368] viXra:2101.0031 [pdf] submitted on 2021-01-05 01:40:34
Authors: Nikolai-Iraj Sanamrad
Comments: 73 Pages.
The problem of authenticating online users can be divided into two general subproblems: confirming that two different web users are the same person, and confirming that two different web users are not the same person. The easiest and most accessible fingerprinting method used by online services is browser fingerprinting. Browser fingerprinting distinguishes between user devices using information about the browser and the system, which is usually provided by most browsers to any website, even when cookies are turned off. These data are usually unique enough even for identical devices, since they heavily depend on how the device is used. In this work, browser fingerprinting is improved with information acquired from an eye tracker.
Category: Data Structures and Algorithms
[367] viXra:2101.0030 [pdf] submitted on 2021-01-05 01:43:08
Authors: Oliwia Oles
Comments: 57 Pages. Language is German.
This thesis deals with the creation and evaluation of a framework for conducting studies and collecting their data, for example gaze-position prediction or emotion recognition. The eye tracker is based on the webgazer.js library from Brown University, and the emotion recognition relies on a front-end emotion recognition library built on the Google Shape Detection API. Both libraries were implemented in order to use them for studies. In addition to these data, information such as age and gender is also collected, but only on a voluntary basis, in order to obtain more explanatory power. The framework is then tested and evaluated to find out whether it works as well in practice as in theory, since these are completely client-side applications. Normally, such applications require additional hardware to function; the libraries used, however, manage without it, require only a working webcam, and consist purely of JavaScript code. The question is therefore whether a purely client-side application performs comparably well to other applications in these areas and whether the data can be captured well.
Category: Data Structures and Algorithms
[366] viXra:2101.0029 [pdf] submitted on 2021-01-05 01:44:51
Authors: Roufayda Salaheddine
Comments: 67 Pages. Language is German.
Due to the increasing use and flexibility of the Internet, web applications have become established in many application areas. Especially in empirical research, e.g. in the fields of machine learning or human-computer interaction, web applications are increasingly integrated into studies in order to manage and evaluate large datasets quickly and flexibly. For this, it is important to design data storage software in such a way that high-quality and efficient storage of the collected study data is guaranteed. The goal of this work is the development and evaluation of web-platform-based data storage and assessment software for eye-tracking and emotion-detection studies. To this end, a web platform based on the client-server model was created that collects participant data and then manages it in a database. Building on this, the software is evaluated and assessed within the scope of this work against various requirements.
Category: Data Structures and Algorithms
[365] viXra:2012.0211 [pdf] submitted on 2020-12-29 12:34:50
Authors: Neel Adwani
Comments: 3 Pages.
This paper proposes a whole new concept in the field of cryptography: EsoCiphers. Short for Esoteric Cipher, an EsoCipher is an algorithm that can be understood by the few who have knowledge of its backend architecture. The idea behind this concept is derived from esoteric programming languages, but EsoCiphers will have practical uses in the future as more research is done on the topic. Although the algorithm is quite simple to understand, the complexity of the output will prove difficult to brute-force if a secret is embedded in it. It uses a hybrid cryptography-based technique which combines ASCII, binary, octal, and ROT13 ciphers. An implementation and a similarity index are provided to show that it can be implemented practically.
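A toy sketch of one hybrid step in this spirit (ROT13 followed by a binary encoding; the paper's full pipeline also involves ASCII and octal stages and an embedded secret):

import codecs

def eso_encode(text):
    # ROT13 the letters, then emit each character's code point in binary
    rot = codecs.encode(text, "rot_13")
    return " ".join(format(ord(ch), "08b") for ch in rot)

print(eso_encode("Hi"))  # 'Hi' -> 'Uv' -> '01010101 01110110'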
Category: Data Structures and Algorithms
[364] viXra:2012.0177 [pdf] submitted on 2020-12-24 20:20:17
Authors: K. S. Ooi
Comments: 8 Pages.
Python slice syntax is nothing new; we find such syntax in other programming languages. However, string slicing in Python is usually covered only briefly in standard references, Python textbooks, and popular websites. In this article, the author attempts to explore every nook and cranny of string slicing in Python. Besides proving a string slicing theorem, the author explores the limitations of string slicing expressed in everyday language, the default values of all three arguments of the slice, the minimal string slice statement, the meaning of a negative step, the influence of the step on the other two arguments, and two suggestions for correctly predicting the substring of a slice.
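A few of the behaviors the article examines, in runnable form:

s = "slicing"
print(s[2:5])    # 'ici'     start at 2, stop before 5
print(s[:3])     # 'sli'     start defaults to 0
print(s[4:])     # 'ing'     stop defaults to len(s)
print(s[::2])    # 'siig'    every second character
print(s[::-1])   # 'gnicils' a negative step reverses, and the defaults flip
print(s[-3:])    # 'ing'     negative indices count from the end
print(s[5:100])  # 'ng'      an out-of-range stop is clamped, with no error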
Category: Data Structures and Algorithms
[363] viXra:2011.0123 [pdf] submitted on 2020-11-16 09:26:55
Authors: Daher Al Baydli
Comments: 14 Pages. Preprint paper
The aim of this paper is to give a computational treatment for computing the cup product and Steenrod operations on the cohomology rings of as many groups as possible. We find a new method that could be faster than the methods of Rusin, Vo, and Guillot. Some approaches are available for computing Steenrod operations on these cohomology rings. We compute all Steenrod squares on the mod 2 cohomology of all groups of order dividing 32 and of all but 58 groups of order 64; partial information on the Steenrod squares is obtained for all but two groups of order 64. For groups of order 32, this paper completes the partial results due to Rusin, Thanh Tung Vo, and Guillot.
Category: Data Structures and Algorithms
[362] viXra:2010.0244 [pdf] submitted on 2020-10-30 08:56:21
Authors: Chun Lin
Comments: 7 Pages. [Corrections made by viXra Admin to conform with the requirements on the Submission Form]
The problems of LCS (longest common subsequence) and LIS (longest increasing subsequence) are both well solved; the former was proved to be solvable in $O(nm/\log n)$ \cite{masek} and the latter in $O(n\log\log n)$ \cite{segal}. Recently, the problem LCIS (longest common increasing subsequence) was proposed. While it can be seen as a combination of the two aforementioned problems, it seems difficult to adapt their solutions to the LCIS problem. Most approaches to LCIS utilize dynamic programming and sophisticated data structures to speed up, while we use a simple algorithm to attack the case where the alphabet size is extremely small compared to the sequence lengths.
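For reference, the classic O(nm) dynamic program for LCIS is short; a Python sketch:

def lcis_length(a, b):
    # dp[j] = length of the longest common increasing subsequence of the
    # prefix of a processed so far and of b, ending exactly at b[j]
    dp = [0] * len(b)
    for x in a:
        best = 0  # best dp[j] over positions j with b[j] < x, so far
        for j, y in enumerate(b):
            if y == x:
                dp[j] = max(dp[j], best + 1)
            elif y < x:
                best = max(best, dp[j])
    return max(dp, default=0)

print(lcis_length([3, 1, 4, 1, 5, 9, 2, 6], [1, 4, 5, 6]))  # 4, i.e. (1, 4, 5, 6)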
Category: Data Structures and Algorithms
[361] viXra:2010.0188 [pdf] submitted on 2020-10-23 10:38:56
Authors: Eliahu Shaikevich
Comments: 10 Pages. [Corrections made to conform with the requirements on the Submission Form]
There is a historical number of trades N0 of both parities, and a way of placing them on the market, such that for any N > N0 the following inequality always holds: M+(i) ≥ M−(i) + V−(i). We call it the Cover Theorem.
Category: Data Structures and Algorithms
[360] viXra:2010.0059 [pdf] submitted on 2020-10-09 20:02:43
Authors: Eren Unlu
Comments: 5 Pages.
A recently proposed temporal correlation based network framework applied on financial markets called Structural Entropy has prompted us to utilize it as a means of analysis for COVID-19 fatalities across countries. Our observation on the resemblance of volatility of fluctuations of daily novel coronavirus related number of deaths to the daily stock exchange returns suggests the applicability of this approach.
Category: Data Structures and Algorithms
[359] viXra:2009.0077 [pdf] submitted on 2020-09-11 09:12:55
Authors: Farzad Didehvar
Comments: 9 Pages.
We have shown the plausibility of considering time as a fuzzy concept instead of classical time [7], [8]. By considering time as a fuzzy concept, we obtain new classes of complexity. Here, we show how some famous problems can be solved in this new picture.
Category: Data Structures and Algorithms
[358] viXra:2009.0076 [pdf] submitted on 2020-09-11 09:13:21
Authors: Farzad Didehvar
Comments: Pages.
In a series of articles we try to show the need for a novel theory of computation based on considering time as a fuzzy concept. Time is a central concept in physics. First, we were forced to consider some changes and modifications in the theories of physics. In the second step, and throughout this article, we show the positive impact of this modification on the Theory of Computation and Complexity Theory, rebuilding it in a more successful and fruitful approach. We call this novel theory TC*.
Category: Data Structures and Algorithms
[357] viXra:2009.0035 [pdf] submitted on 2020-09-04 11:06:02
Authors: Eliahu Shaykevich
Comments: 10 Pages.
There is a historical number of deals N0 of both parities, and a way of placing them on the market, such that for any N > N0 the inequality M+(i) ≥ M−(i) + V−(i) holds; we will call it the "Shaykevich Inequality" or the Covering Theorem (Cover Theorem).
Category: Data Structures and Algorithms
[356] viXra:2008.0138 [pdf] submitted on 2020-08-19 11:01:55
Authors: Alexander Garyfallos
Comments: 26 Pages.
Forecasting forthcoming "health events" is an extremely challenging task for the Remote Patient Monitoring (RPM) systems sector, which relies on real-time information and communication technologies. Remote patient monitoring is a medical service that involves following and observing patients who are not in the same location as their health care provider. In general, the patient is equipped with a "smart" monitoring device, and the recorded data (vital signs) are securely transmitted via telecommunication networks to the health care provider. Modern remote patient monitoring devices are small, discreet and easy to wear, allowing "bearers" to move freely and in comfort. In this framework, MOKAAL pc has developed the IFS_RPM service (Integrated Facilitation Services for Remote Patient Monitoring), supplying the ICT infrastructure necessary for the provision of RPM services. Following the completion of the IFS_RPM project, MOKAAL pc launched a research project under the code name "PROPHET™".
PROPHET's main objective is to investigate the possibilities of introducing a real-time predicting model based on remotely collected vital signs, one that utilizes time series of metric data in conjunction with the information stored in the Electronic Health Records (EHR) of the "bearer", attempting to predict, in real time, the probability of a "health event" occurring in the near future. To meet this objective, the PROPHET project team designed an evolutionary prototype of the "health event" forecasting model, which was developed and tested in a laboratory environment; it will be upgraded to a working prototype to be tested in real conditions, in order to be incorporated into the IFS_RPM system after reaching maturity.
Category: Data Structures and Algorithms
[355] viXra:2008.0122 [pdf] submitted on 2020-08-16 20:36:30
Authors: James Dow Allen
Comments: 12 Pages.
The theoretical minimum storage needed to represent a set of size N drawn from a universe of size M is about N·(log2(M/N) + 1.4427) bits (assuming neither N nor M/N is very small). I review the technique of 'quotienting' which is used to approach this minimum, and look at the actual memory costs achieved by practical designs. Instead of somehow implementing and exploiting 1.4427 bits of steering information, most practical schemes use two bits (or more). In the conclusion I mention a scheme to reduce the overhead cost from two bits to a single trit.
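The minimum itself is easy to check numerically: the exact count of N-element subsets is the binomial coefficient C(M, N), so the bound is its base-2 logarithm.

import math

def exact_min_bits(M, N):
    # Exact information-theoretic minimum: log2(C(M, N))
    return math.log2(math.comb(M, N))

M, N = 10**6, 10**3
approx = N * (math.log2(M / N) + math.log2(math.e))
print(exact_min_bits(M, N), approx)  # the approximation tracks the exact value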
Category: Data Structures and Algorithms
[354] viXra:2008.0092 [pdf] submitted on 2020-08-13 07:49:42
Authors: Rahul Kumar Singh, Sanjeev Saxena
Comments: 4 Pages.
In this note, we present a simpler algorithm for the joint seat allocation problem in the case where there are two or more merit lists. In the case of two lists (the current situation for engineering seats in India), the running time of the algorithm is proportional to the sum of the running times of two separate (delinked) allocations. The algorithm is straightforward and natural, and is not (at least directly) based on the deferred acceptance algorithm of Gale and Shapley. Each person can only move higher in his or her preference list. Thus, all steps of the algorithm can be made public. This will improve transparency and trust in the system.
Category: Data Structures and Algorithms
[353] viXra:2008.0073 [pdf] submitted on 2020-08-11 19:36:48
Authors: Leslie Nneji Ndugbu
Comments: 13 Pages.
This work is concerned with road traffic offence information management in Nigeria. It focuses on trends in road traffic offence information and carries out a critical review of the current information and communication technology compliance state of the FRSC, with a view to identifying its defects in road traffic offence information management. A system to correct the road traffic offence information management failures identified in the existing system is then proposed. Road traffic offence records and details of current safety measures, obtained from the FRSC and online in addition to research works, provided the basic data for the study. The results showed a high rate of road traffic offences as a result of poor road traffic offence information management and failure to improve on the existing road traffic information management.
Category: Data Structures and Algorithms
[352] viXra:2007.0143 [pdf] submitted on 2020-07-18 12:35:11
Authors: Robert S. Adlemir, Chris K. Wong
Comments: 5 Pages.
We identify a major security flaw in the modern gift transaction protocol that allows malicious entities to send questionable metadata to insecure recipients. To address these weaknesses we introduce the Blockcard protocol, a novel variant of Blockchain technology that uses an asymmetric proof-of-work CPU cost function over payload metadata to provide a cryptographically secure and efficient method of verifying that gift-givers thought enough about the recipient's payload, or lack thereof, for it to count. This has the advantage of making it computationally infeasible and socially awkward for adversarial gift-givers to double-spend, spoof, or precompute their celebratory thoughts.
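A generic hashcash-style proof-of-work sketch in Python, standing in for the paper's cost function (the difficulty and encoding are illustrative):

import hashlib
from itertools import count

def mine(metadata, difficulty=16):
    # Find a nonce whose SHA-256 over metadata||nonce has `difficulty`
    # leading zero bits, i.e. falls below the target threshold
    target = 1 << (256 - difficulty)
    for nonce in count():
        digest = hashlib.sha256(metadata + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce

print(mine(b"Happy birthday! Thinking of you."))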
Category: Data Structures and Algorithms
[351] viXra:2007.0082 [pdf] submitted on 2020-07-13 10:58:50
Authors: Jeff Linahan
Comments: 5 Pages.
We discuss an exception handling optimization that achieves zero overhead in both space and time compared to ordinary C-style error handling control flow when the compiler can see which catch block a given throw expression will land in. The technique brings exceptions more in line with the design goals of C++, reducing the need for alternate error handling mechanisms.
Category: Data Structures and Algorithms
[350] viXra:2007.0063 [pdf] submitted on 2020-07-10 15:47:36
Authors: Karim Baina, Boualem Benatallah
Comments: 20 Pages.
With the disruption produced by the extensive automation of automation due to advanced research in machine learning and auto machine learning, even in programming language translation [Lachaux et al., 2020], the main goal of this paper is to discuss the following question: "Is it still worth the cost to teach compiling in 2020?". Our paper defends the "yes" answer within software engineering majors. The paper also shares best practices from teaching a compiling techniques course for more than 15 years, and presents and evaluates this experience through Hortensias, a pedagogical compiling laboratory platform providing a language compiler and a virtual machine. Hortensias is a multilingual pedagogical platform for learning and teaching how to build compiler front- and back-ends. The Hortensias language offers the programmer the possibility to customise the compiler's associativity management, visualise the intermediate representations of compiled code, or customise the optimisation management and the error management language for international student communities. Hortensias offers beginner programmers the possibility to use a graphical user interface to program by clicking. The Hortensias compiling pedagogy evaluation was conducted through two surveys involving engineering students and alumni on a voluntary basis during one week. It targeted two null hypotheses: the first supposes that compiler teaching is becoming outdated with regard to current curricula evolution, and the second supposes that Hortensias-based compiling pedagogy has no impact on either understanding or implementing compilers and interpreters. During fifteen years of teaching compiler engineering, Hortensias was a wonderful pedagogical experiment for both teaching and learning, since popularising abstract concepts becomes much easier for teachers, lectures follow a gamification-like approach, and students become efficient in delivering versions of their compiler software product at a fast pace.
Category: Data Structures and Algorithms
[349] viXra:2007.0008 [pdf] submitted on 2020-07-01 12:58:38
Authors: Ortho Flint
Comments: 3 Pages.
The deterministic polynomial time algorithm that determines satisfiability of 3-SAT can be generalized for SAT.
Category: Data Structures and Algorithms
[348] viXra:2006.0245 [pdf] submitted on 2020-06-26 16:07:53
Authors: Dibyendu Baksi
Comments: 11 Pages.
The COVID-19 crisis is providing a lot of impetus to the search for innovative technological solutions to the major problems of tracking and containing the pandemic. The major cornerstones of testing, isolation, contact tracing and quarantine are well understood and agreed upon at a general level. In this paper, the software architecture required for implementing successful digital contact tracing applications is elaborated. The goal of contact tracing is to proactively identify the infection chain of the population, including asymptomatic people who came in contact with infected people who tested positive, i.e., to prevent asymptomatic people from unintentionally spreading the disease. The entire ecosystem of contact tracing is explained so that the real challenges of integrating the key healthcare components are appreciated.
Category: Data Structures and Algorithms
[347] viXra:2006.0086 [pdf] submitted on 2020-06-10 02:21:04
Authors: Karim Baina
Comments: 29 Pages.
Epidemiologists, scientists, statisticians, historians, data engineers and data scientists are working on finding descriptive models and theories to explain the COVID-19 expansion phenomenon, or on building predictive analytics models for learning the apex of the COVID-19 confirmed cases, recovered cases, and deaths time series curves. In the CRISP-DM life cycle, 75% of the time is consumed by the data preparation phase alone, putting a lot of pressure and stress on scientists and data scientists building machine learning models. This paper aims to help reduce data preparation efforts by presenting a detailed data preparation repository with shell and Python scripts for formatting, normalising, and integrating Johns Hopkins University COVID-19 daily data, via three normalisation user stories applying data preparation at the lexical, syntactic/semantic and pragmatic levels, and four integration user stories covering geographic, demographic, climatic, and distance-based similarity dimensions, among others. This paper and the related open source repository will help data engineers and data scientists aiming to deliver results in an agile analytics life cycle adapted to the critical COVID-19 context.
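A minimal sketch of one such normalisation step, assuming the JHU CSSE wide CSV layout (one column per date) and pandas; the file name is hypothetical and this is not the paper's repository.

    # Reshape the wide per-date columns into a tidy (country, date, confirmed) table.
    import pandas as pd

    wide = pd.read_csv("time_series_covid19_confirmed_global.csv")
    long = wide.melt(
        id_vars=["Province/State", "Country/Region", "Lat", "Long"],
        var_name="date", value_name="confirmed")
    long["date"] = pd.to_datetime(long["date"], format="%m/%d/%y")
    daily = long.groupby(["Country/Region", "date"], as_index=False)["confirmed"].sum()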
Category: Data Structures and Algorithms
[346] viXra:2006.0043 [pdf] submitted on 2020-06-05 01:32:34
Authors: Ekesh Kumar
Comments: 2 Pages.
The knapsack problem is a problem in combinatorial optimization that seeks to maximize an objective function subject to a weight constraint. We consider the stochastic variant of this problem in which the value vector $\mathbf{v}$ remains deterministic, but the weight vector $\mathbf{x}$ is an $n$-dimensional vector drawn uniformly at random from $[0, 1]^{n}$. We establish a sufficient condition under which the summation-bound condition is almost surely satisfied. Furthermore, we discuss the implications of this result for the deterministic problem.
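The paper's exact condition is not reproduced here; as a hedged illustration of the underlying concentration phenomenon, the Monte Carlo sketch below shows that when weights are uniform on [0,1], the total weight concentrates near n/2, so a capacity safely above n/2 is met almost surely as n grows. The capacity factor 0.55 is an arbitrary choice.

    # Fraction of random instances whose total weight fits a capacity of 0.55*n.
    import numpy as np

    rng = np.random.default_rng(0)
    for n in (10, 100, 1000):
        x = rng.uniform(size=(100_000, n))          # 100k random weight vectors
        frac = (x.sum(axis=1) <= 0.55 * n).mean()
        print(n, frac)   # fraction satisfying the bound tends to 1 as n grows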
Category: Data Structures and Algorithms
[345] viXra:2006.0040 [pdf] submitted on 2020-06-05 06:30:57
Authors: Arun Jose
Comments: 5 Pages.
This paper examines the feedback cycle between news ratings and electoral polling, and offers a news-ranking algorithm to patch the problem. The cycle hinges on overexposure of a candidate familiarizing their name to otherwise apathetic voters; the algorithm therefore weighs down exposure on a logarithmic scale, passing only increasingly important news as coverage of a candidate inflates. This problem is a symptom of a deeper issue, and the solution proposed here patches it for the present, as well as offering insight into the machinations of the issue and thereby aiding its understanding.
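A hypothetical scoring rule in the spirit described (all parameter names and the threshold are assumptions, not the paper's algorithm): exposure is damped logarithmically, so a story must be more important to pass as a candidate's prior coverage grows.

    # Publish only if importance outweighs log-damped prior exposure.
    import math

    def passes(importance: float, exposure_count: int, threshold: float = 1.0) -> bool:
        return importance / math.log2(2 + exposure_count) >= threshold

    print(passes(1.0, 0), passes(1.0, 30))  # True False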
Category: Data Structures and Algorithms
[344] viXra:2006.0017 [pdf] submitted on 2020-06-02 07:29:39
Authors: Sanjeev Saxena
Comments: 4 Pages.
In this note, a simple description of the zone theorem in three dimensions is given.
Category: Data Structures and Algorithms
[343] viXra:2005.0292 [pdf] submitted on 2020-05-31 20:28:59
Authors: Henri Aare, Peter Vitols
Comments: 9 Pages.
Distributed ledger technology has been widely hailed as a breakthrough technology. It has realised a great number of application scenarios, and improved the workflow of many domains. Nonetheless, there remain a few major concerns in adopting and deploying distributed ledger technology at scale. In this white paper, we tackle two of them, namely throughput scalability and confidentiality protection for transactions. We learn from the existing body of research, and build a scale-out blockchain platform that champions privacy, called RVChain. RVChain takes advantage of trusted execution environments to offer confidentiality protection for transactions, and scales the throughput of the network in proportion to the number of network participants by supporting parallel shadow chains.
Category: Data Structures and Algorithms
[342] viXra:2005.0145 [pdf] submitted on 2020-05-12 23:52:43
Authors: George Plousos
Comments: 6 Pages.
One of the objectives of this article is to contribute to the further development and improvement of similar algorithms. I will first outline the steps that gradually lead to this algorithm and give some instructions on how to use it. There are several experimentation possibilities that can lead to improved performance. This also depends on the technical characteristics of the computer on which the program will run.
Category: Data Structures and Algorithms
[341] viXra:2005.0108 [pdf] submitted on 2020-05-09 05:04:06
Authors: Abhinav Sagar
Comments: 23 Pages.
Collision detection is the computational problem of detecting the intersection of two or more objects. While collision detection is most often associated with its use in video games and other physical simulations, it also has applications in robotics. In addition to determining whether two objects have collided, collision detection systems may also calculate the time of impact and report a contact manifold. Collision detection is one of the most challenging and complex parts of game programming and is the key area where performance is usually lost. To address this, many data structures are used that eliminate unnecessary collision checks, such as quadtrees, octrees, BSP trees, and grids.
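A minimal broad-phase sketch of the grid idea: axis-aligned bounding boxes are bucketed into uniform cells so only objects sharing a cell are pair-tested. The cell size and boxes are illustrative.

    from collections import defaultdict
    from itertools import combinations

    def overlaps(a, b):  # boxes as (xmin, ymin, xmax, ymax)
        return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

    def broad_phase(boxes, cell=10.0):
        grid = defaultdict(list)
        for i, (x0, y0, x1, y1) in enumerate(boxes):
            for cx in range(int(x0 // cell), int(x1 // cell) + 1):
                for cy in range(int(y0 // cell), int(y1 // cell) + 1):
                    grid[(cx, cy)].append(i)       # register box in each cell it touches
        pairs = set()
        for bucket in grid.values():                # narrow-phase test only within cells
            for i, j in combinations(bucket, 2):
                if overlaps(boxes[i], boxes[j]):
                    pairs.add((min(i, j), max(i, j)))
        return pairs

    print(broad_phase([(0, 0, 3, 3), (2, 2, 5, 5), (40, 40, 42, 42)]))  # {(0, 1)}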
Category: Data Structures and Algorithms
[340] viXra:2004.0626 [pdf] submitted on 2020-04-27 05:07:50
Authors: Asha Ali Juma
Comments: 4 Pages.
Blockchain is a technology that introduces a decentralized and distributed ledger storing records which can be accessed publicly. The records, or data, are stored in blocks which are chained chronologically to preserve the integrity of the data stored in them. Unlike in other kinds of database, the blocks that store the data carry two hashes: the hash of the block before it, and its own hash, which is treated as the fingerprint of the block and protects it from being tampered with. The tamper-proof and distributed nature of blockchain technology has made it one of the most used technologies for preserving the integrity and increasing the transparency of data in various industries. One of the advantages of using the technology is that although the data can be accessed by anyone in the network, the block that stores the data is immutable. Many misconceptions have arisen regarding medical record privacy, risks to transparency, procedures and other related health issues. This paper reviews the impact of blockchain technology on electronic medical services to address these issues.
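A toy sketch of the two-hash chaining described above: each block stores its predecessor's hash, so altering any block breaks every later link. The record contents are hypothetical.

    import hashlib, json

    def block_hash(block):
        payload = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def append(chain, data):
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "data": data})

    def verify(chain):   # every block must reference the true hash of its predecessor
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = []
    append(chain, "patient record A")
    append(chain, "patient record B")
    print(verify(chain))          # True
    chain[0]["data"] = "tampered"
    print(verify(chain))          # False: tampering is detected downstream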
Category: Data Structures and Algorithms
[339] viXra:2004.0605 [pdf] submitted on 2020-04-26 10:38:18
Authors: Vinay Gupta
Comments: 5 Pages.
This work concerns a multiple-cloud system for storing more data in the cloud and for sharing huge amounts of information from one place to another at minimum cost. Because each cloud provides only limited space while our data increases over time, we use multiple cloud applications and websites, and we try to avoid storing the same data more than once. We use MapReduce technology to detect identical data, merge it into a single file, and identify, through the gateway server and the LBA (Logical Block Area), where the same data files are already stored. This method is mainly of use to large organizations, government, the telecom industry and many more. A hash function created during a session in the middle of the topology minimizes task movement cost in this topology. We work on the network topology because multiple customers use the internet at the same time; the aim is to reduce network errors and control network traffic without taking extra time.
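A hedged sketch of the deduplication idea only: content-hash each file so identical data is stored once, with a map from hash to the cloud (and logical block area) that already holds it. All names are hypothetical and this is not the paper's system.

    import hashlib

    stored = {}  # sha256 hex -> (cloud_id, lba) of the existing copy

    def put(data: bytes, cloud_id: str, lba: int):
        key = hashlib.sha256(data).hexdigest()
        if key in stored:
            return stored[key]          # duplicate detected: reuse existing copy
        stored[key] = (cloud_id, lba)
        return stored[key]

    print(put(b"report.pdf contents", "cloud-A", 42))   # ('cloud-A', 42)
    print(put(b"report.pdf contents", "cloud-B", 7))    # same location returned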
Category: Data Structures and Algorithms
[338] viXra:2004.0515 [pdf] submitted on 2020-04-22 11:48:45
Authors: Bahram Kalhor, Farzaneh Mehrparvar
Comments: 13 Pages.
Although many methods have been designed for ranking universities, there is no suitable system that focuses on ranking countries based on the performance of their universities. The overall ranking of the universities in a region can indicate the growth of interest in science among the people of that land. This paper introduces a novel ranking mechanism based on the rankings of universities. Firstly, we introduce and discuss two new methods of ranking countries based on their university rankings: Weighted Ranking (WR) and Average Ranking (AR). Secondly, we create rankings of countries according to the selected method, based on the top 12000 universities in webometrics.info (January 2012), and compare rankings of countries across four editions (January 2012 to July 2013). Thirdly, we choose QS (http://www.topuniversities.com) and webometrics.info as two different classification systems for comparing rankings of countries, based on the top 500 universities in each. Results indicate that the methodology can be used to reflect the quality of a country's universities as a whole and to compare countries' rankings in practice against other countries in the world.
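A sketch of the Average Ranking idea; the Weighted Ranking weights used here (1/rank, so better-ranked universities count more) are an illustrative assumption, not the paper's exact definition.

    from collections import defaultdict

    def country_rankings(university_ranks):  # list of (country, world_rank)
        by_country = defaultdict(list)
        for country, rank in university_ranks:
            by_country[country].append(rank)
        ar = {c: sum(r) / len(r) for c, r in by_country.items()}          # lower is better
        wr = {c: sum(1.0 / x for x in r) for c, r in by_country.items()}  # higher is better
        return ar, wr

    ar, wr = country_rankings([("X", 1), ("X", 50), ("Y", 10), ("Y", 12)])
    print(sorted(ar, key=ar.get), sorted(wr, key=wr.get, reverse=True))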
Category: Data Structures and Algorithms
[337] viXra:2004.0457 [pdf] submitted on 2020-04-19 11:16:01
Authors: Meghana Prakash, Vignesh S
Comments: 4 Pages.
Cloud computing is a collection of several computer resources consisting of both software and hardware. It is a type of service that is delivered over the internet and can be accessed from anywhere. [1] The data and services can be accessed through the internet. [4] These services are managed by third parties over the internet, which ultimately provide access to the servers and resources. Health records consist of patients' health data. This data is usable by both hospitals and patients. [6] [8] It can be used to track the medical history of patients. Data visualization is a graphical depiction of data; it involves producing images that highlight the relationships within the data that users view. Hence, such visualizations are used for clinical decision making. In this paper we discuss how the cloud can be used to maintain health records electronically.
Category: Data Structures and Algorithms
[336] viXra:2004.0381 [pdf] submitted on 2020-04-15 12:51:27
Authors: Zeeshan Sharief
Comments: 2 Pages.
Searchable encryption allows a cloud server to conduct keyword search over encrypted data on behalf of the data users without learning the underlying plaintexts. However, most existing searchable encryption schemes only support single or conjunctive keyword search, while the few other schemes that can perform expressive keyword search are computationally inefficient since they are built from bilinear pairings over composite-order groups. In this paper, we propose an expressive public-key searchable encryption scheme in the prime-order groups, which allows keyword search policies (i.e., predicates, access structures) to be expressed in conjunctive, disjunctive or any monotonic Boolean formulas, and achieves significant performance improvement over existing schemes. We formally define its security and prove that it is selectively secure in the standard model. Also, we implement the proposed scheme using a rapid prototyping tool called Charm and conduct several experiments to evaluate its performance. The results demonstrate that our scheme is much more efficient than the ones built over composite-order groups.
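To convey the basic idea only, here is a drastically simplified symmetric sketch (deterministic keyword tokens via HMAC); the paper's pairing-based, expressive public-key scheme is far more sophisticated and hides far more than this toy, which leaks search patterns.

    import hmac, hashlib

    KEY = b"data-user-secret"

    def token(keyword: str) -> str:
        return hmac.new(KEY, keyword.encode(), hashlib.sha256).hexdigest()

    # Server-side index maps keyword tokens to (encrypted) document ids.
    index = {token("invoice"): ["doc1", "doc7"], token("covid"): ["doc3"]}

    def search(keyword: str):
        return index.get(token(keyword), [])  # server never sees the plaintext keyword

    print(search("invoice"))  # ['doc1', 'doc7']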
Category: Data Structures and Algorithms
[335] viXra:2004.0285 [pdf] submitted on 2020-04-12 05:08:49
Authors: Krishna B L
Comments: 9 Pages.
Fileless ransomware is a new type of ransomware that primarily follows the mechanisms of both ransomware and fileless malware. Detecting and defending against these kinds of attacks is becoming a great obstacle for IT firms. Cybercriminals have found a new way of extorting ransom with vicious methods, mainly from big organizations, government, the telecom industry and many more. Traditional AV engines are not able to defend against fileless malware. This paper describes the mechanisms of both ransomware and fileless malware, the working of fileless ransomware, its possible attack vectors, variations of fileless ransomware and their instances, and prevention methods and recommendations to defend against fileless ransomware.
Category: Data Structures and Algorithms
[334] viXra:2004.0247 [pdf] submitted on 2020-04-10 16:37:50
Authors: Umaima Khan
Comments: 18 Pages. This article has been originally published at www.ijarbas.com European International Journal, Vol.#2, Issue:2, February 2020.
Windows and Linux are both operating systems. Windows is the more famous operating system in the market, but it is not as safe as Linux. With growing concern over operating system security, Linux became popular in the consumer marketplace for its safety and efficiency, and many companies have migrated from Windows to Linux, although the shortage of Linux experts has limited the growth of Linux. If cost is considered, Linux fares better than Windows. Windows is suitable for small matrices while Linux is suitable for large matrices. The aim of this paper is to conduct a survey of Linux and Windows. The paper is comparative: we compare the different methodologies related to Windows and Linux used in different pieces of research, along with the results of their experiments, and review the approaches presented in previous research to solve problems related to Windows and Linux.
Keywords: Linux; Windows; operating system; virtual memory management; simulation; Windows synchronization mechanism.
Category: Data Structures and Algorithms
[333] viXra:2004.0223 [pdf] submitted on 2020-04-10 11:44:43
Authors: Shouket Ahmad Kouchay
Comments: 6 Pages. RSA, blowfish algorithm, Cloud computing, encryption, decryption, data security
Cloud computing has revolutionized the IT world. Cloud computing is not only beneficial for everyday users but also for large enterprises, as it is capable of sharing large data in different forms; to safeguard that valuable data in secure storage, security management procedures must be implemented. Cloud computing is dynamically flexible as well as sizable and cost-efficient, involving important virtual tools. There is not adequate security in the usual asymmetric encryption, as the single secret key in such an algorithm could be compromised by attackers, so a requirement for improved data protection arises. Different effective ways to secure cloud computing have been introduced, but the data remains under constant threat from security exploits. Secure storage in cloud computing is a fundamental move for the IT world. This research proposes an effective cryptographic technique for securing data in cloud computing that provides better security and accountability for data in the cloud. The primary analysis of the proposed technique demonstrates better performance results. This research also highlights many security issues in different encryption algorithms.
Category: Data Structures and Algorithms
[332] viXra:2004.0198 [pdf] submitted on 2020-04-08 14:35:22
Authors: Usama Kadri
Comments: 3 Pages.
Rapid testing of appropriate specimens from patients suspected of Coronavirus infection is of great importance for disease management and control. We propose complementary approaches to enhance the processing of large numbers of collected samples. The approaches are based on mixing samples in testing cups, as opposed to testing single samples in each cup. As a result, the number of tests can be boosted by orders of magnitude, and the effective testing time can be reduced drastically.
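One classic way to realise the mixing idea is two-stage (Dorfman) pooling, sketched below as a hedged illustration rather than the paper's specific proposal: with prevalence p and pools of size k, the expected number of tests per person is 1/k + 1 - (1-p)^k.

    # Expected tests per person under two-stage pooling, and the best pool size.
    def tests_per_person(k: int, p: float) -> float:
        return 1.0 / k + 1.0 - (1.0 - p) ** k

    p = 0.02  # assumed 2% prevalence
    best = min(range(2, 40), key=lambda k: tests_per_person(k, p))
    print(best, tests_per_person(best, p))  # pool size ~8 needs ~0.27 tests/person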
Category: Data Structures and Algorithms
[331] viXra:2003.0680 [pdf] submitted on 2020-03-31 18:53:56
Authors: Luowen Qian
Comments: 2 Pages.
In this work, we introduce the nonuniform complexity class LD, which stands for linear depth circuits. We also prove that LD = ALL, the complexity class that contains all decision languages, thereby resolving all questions about this new complexity class. No further research is needed [Mun20]. In principle, this shows that anything can be computed via circuits in linear time, albeit with (possibly undecidable) pre-computation and very inefficient advice; however, we note that exponential-sized advice suffices to achieve ALL.
Category: Data Structures and Algorithms
[330] viXra:2003.0214 [pdf] submitted on 2020-03-10 16:36:59
Authors: Jason Reaves
Comments: 9 Pages.
YARA is a tool that has been heavily adopted within the cyber security community; it was built to aid malware researchers in identifying and classifying malicious objects[1]. Instead of approaching the detection of malicious objects with a purely good-or-bad mindset, we can utilize added functionality of YARA, namely tags, to judge how malicious or suspicious an object is by looking at the problem in smaller sets. This concept is commonly used in the heuristic engines of antiviruses and sandboxes, where a weight of maliciousness is assigned to an object. The aim of this paper is to introduce a method by which such an engine can be built by an organization utilizing existing software.
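A minimal sketch of the weighting idea: tags attached to matched rules carry weights that sum into a suspicion score. The tag names, weights and thresholds are hypothetical, not taken from the paper.

    TAG_WEIGHTS = {"packer": 10, "persistence": 25, "ransom_note": 60, "pe_anomaly": 15}

    def score(matched_tags):
        return sum(TAG_WEIGHTS.get(t, 0) for t in matched_tags)

    def verdict(matched_tags, bad=75, suspicious=30):
        s = score(matched_tags)
        return s, ("malicious" if s >= bad else "suspicious" if s >= suspicious else "clean")

    print(verdict(["packer", "persistence"]))                # (35, 'suspicious')
    print(verdict(["packer", "ransom_note", "pe_anomaly"]))  # (85, 'malicious')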
Category: Data Structures and Algorithms
[329] viXra:2003.0039 [pdf] submitted on 2020-03-02 04:40:16
Authors: Elie Duthoo
Comments: 13 Pages. Creative Commons 3.0 Attribution-Noncommercial-ShareAlike license
The Voynich Manuscript (VMS) is an illustrated hand-written document carbon-dated in the early 15th century. This paper aims at providing a statistically robust method for translating voynichese, the language used in the VMS. We will first provide a set of statistical properties that can be applied to any tokenizable language with sub-token elements, apply it to Universal Dependencies (UD) dataset plus VMS (V101 transliteration) to see how it compares to the 157 corpora written in 90 different languages from UD. In a second phase we will provide an algorithm to map characters from one language to characters from another language, and we will apply it to the 158 corpora we have in our possession to measure its quality. We managed to attack more than 60% of UD corpora with this method though results for VMS don't appear to be usable.
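For orientation only, here is the naive frequency-rank baseline for mapping characters between two languages; the paper's statistical method is considerably more elaborate than this sketch, and the sample strings are hypothetical.

    # Align characters of two corpora by frequency rank.
    from collections import Counter

    def rank_map(src_text: str, dst_text: str):
        src = [c for c, _ in Counter(src_text).most_common()]
        dst = [c for c, _ in Counter(dst_text).most_common()]
        return dict(zip(src, dst))

    m = rank_map("daiin daiin chol", "the the cat")
    print("".join(m.get(c, "?") for c in "chol"))  # crude, rank-based guess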
Category: Data Structures and Algorithms
[328] viXra:2003.0016 [pdf] submitted on 2020-03-01 03:02:46
Authors: M. Hanefi CALP
Comments: 19 Pages.
The software testing process consists of activities that are implemented after testing is planned, including documenting the related testing activities. Test processes must be applied in order to clearly see the quality of the software, its degree of reliability, whether it is ready for delivery, its degree of effectiveness, and how much testing remains. One of the most important phases of these processes is test planning. Test planning activities directly affect a project's success in software projects. In this study, the software testing process and the test planning activities carried out within it were first clearly laid out through a literature review. A basic software testing process and test planning process were then determined and explained step by step. In this way, the study aims to give a different and deep perspective on test planning activities and to raise awareness of the topic. The research showed that test planning activities are not currently applied adequately, and that software testers, experts and researchers do not have enough knowledge about the details of the method; this situation causes very serious negative results for delivery and cost in software projects. Finally, the topic is discussed in detail, and some conclusions and recommendations are given for personnel in the field of software testing. In this respect, the study is expected to contribute to the literature.
Category: Data Structures and Algorithms
[327] viXra:2002.0586 [pdf] submitted on 2020-02-29 03:13:54
Authors: Andrei Lucian Dragoi
Comments: (TFM - IOJD - version 1.1 - 27.08.2018 - 8 pages)
This paper presents a practical method for generating and implementing a large number of interconnected online/offline JavaScript (JS) 2D databases (IOJDs), using file-manager-like software built with Microsoft Visual Basic 6 (VB6) for their creation and maintenance (including periodic updates). Both this simple tag-based file manager (TFM) and the IOJDs have many possible applications, including medical databases and website cloning (WSC). Keywords: JavaScript (JS), Visual Basic 6 (VB6), tag-based file manager (TFM), databases, interconnected online/offline JavaScript 2D databases (IOJDs), website cloning (WSC).
Category: Data Structures and Algorithms
[326] viXra:2002.0415 [pdf] submitted on 2020-02-21 14:15:48
Authors: Aditya Das, Rishab Bhattacharyya, Pramit Ghosh
Comments: 4 Pages. This article is under review at IEEE Letters of the Computer Society for publication
This work is a proposed architectural prototype in the field of High Performance Computing (HPC). Intel Altera DE4 and Altera DE5a-Net FPGA boards were used as functional processors in the designed system. We further explore Peripheral Component Interconnect (PCI) Express communication and amalgamate the transfer of data through PCIe to two different kinds of FPGAs at the same time using a proposed scheduling algorithm called TF-PSST: Time-First Power-Second Scheduling Technique. This significantly improves the efficiency of the system by reducing execution time, and because of the heterogeneous nature of the architectural prototype, we also found a way to increase hardware resource utilisation.
Category: Data Structures and Algorithms
[325] viXra:2002.0367 [pdf] submitted on 2020-02-19 14:01:36
Authors: Armando Alaminos-Bouza
Comments: 14 Pages. associated source code in C (C99) is available upon request.
A simple and fast functional model is proposed to approximate energy loss distributions
of charged particles crossing slabs of matter. The most accepted physical models for treating this
problem was created by Landau and later improved by Vavilov. Both models depend on complex
functional forms with exact solutions that are, by far, too CPU intensive to be directly included in
existing Monte Carlo codes. Several authors have proposed approximations with varying degrees of
accuracy and performance. This paper presents a compact and efficient form that approximates
with enough accuracy the Vavilov distribution and its extreme cases of Landau and Gaussian
shapes. Our functional form could be interpreted as a generalization of the basic Gaussian
distribution. Some parameter fits are illustrated with various test cases. Our model also
represents a simple functional form to use for regression analysis with experimental energy loss
data.
Category: Data Structures and Algorithms
[324] viXra:2002.0186 [pdf] submitted on 2020-02-10 04:19:27
Authors: Siddhant Ray
Comments: 9 Pages.
Software Defined Networking (SDN) is a concept in the area of computer networks in which the control plane and data plane of traditional computer networks are separated, as opposed to the mechanism in conventional routers and switches. SDN aims to provide a central control mechanism in the network through a controller known as the SDN Controller. The Controller makes use of various southbound Application Programming Interfaces (APIs) to connect to the physical switches located on the network and pass on the control information that is used to program the data plane. The SDN Controller also exposes several northbound APIs to connect to applications which can leverage the controller to orchestrate the network. The controller used in this paper is the Open Network Operating System (ONOS), on which the algorithm in question is deployed. ONOS provides several APIs which are leveraged to connect the application to the network devices. The typical network path between any two endpoints is a shortest path fulfilling a set of constraints. The algorithm developed here performs optimal K-shortest path searching in a given network satisfying specified constraints, controlled by ONOS.
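An offline illustration of constrained K-shortest path search, using networkx rather than the ONOS APIs; the latency attribute, topology and constraint are hypothetical, and this is not the paper's implementation.

    # Enumerate simple paths in increasing latency, keep those within the bound.
    from itertools import islice
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([("a", "b", 1), ("b", "d", 1), ("a", "c", 2),
                               ("c", "d", 1), ("a", "d", 5)], weight="latency")

    def k_shortest(G, src, dst, k, max_latency):
        paths = nx.shortest_simple_paths(G, src, dst, weight="latency")
        ok = (p for p in paths
              if nx.path_weight(G, p, weight="latency") <= max_latency)
        return list(islice(ok, k))

    print(k_shortest(G, "a", "d", 2, max_latency=4))  # [['a','b','d'], ['a','c','d']]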
Category: Data Structures and Algorithms
[323] viXra:2002.0183 [pdf] submitted on 2020-02-10 06:59:20
Authors: Jason Reaves
Comments: 15 Pages.
Obfuscation in malware is commonly employed for any number of reasons, but its purpose is ultimately the same: to make the underlying malicious entity go unnoticed. Crypters and packers are both heavily employed to bypass common security measures, so ultimately these are just tools: tools that utilize algorithms to take data and turn it into some other data while being able to reverse the process later. Obviously these reversible algorithms can be chained together into 'layers'. In this paper I explore the idea that it is easier to think of these layers as a math equation which can be solved. This has the potential of turning something that can be overwhelming at first, like writing an unpacker, into a much more manageable problem.
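A sketch of the layers-as-an-equation idea: each obfuscation layer is an invertible function, so the packed sample is a composition and unpacking is applying the inverses in reverse order. The XOR key and payload are hypothetical.

    import base64

    enc_layers = [lambda d: bytes(b ^ 0x41 for b in d), base64.b64encode]  # pack: XOR, then base64
    dec_layers = [base64.b64decode, lambda d: bytes(b ^ 0x41 for b in d)]  # unpack: inverses, reversed

    def apply_layers(data, layers):
        for f in layers:
            data = f(data)
        return data

    packed = apply_layers(b"malware config", enc_layers)
    print(apply_layers(packed, dec_layers))  # b'malware config'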
Category: Data Structures and Algorithms
[322] viXra:2001.0605 [pdf] submitted on 2020-01-28 06:59:38
Authors: Edimar Veríssimo da Silva
Comments: 18 Pages. We present the solution to the problem of the eight queens and the complete source code in Harbour language.
In this article we present the solution to the classic eight queens problem, which looks for states of a chessboard on which 8 queens are positioned without attacking one another; that is, they must not share columns, rows or diagonals.
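The article's source code is in Harbour; for comparison, here is a compact backtracking equivalent in Python.

    # Yield all placements; queens[row] = column, diagonals tracked by row+col and row-col.
    def queens(n=8, row=0, cols=(), diag1=(), diag2=()):
        if row == n:
            yield cols
            return
        for c in range(n):
            if c not in cols and row + c not in diag1 and row - c not in diag2:
                yield from queens(n, row + 1, cols + (c,),
                                  diag1 + (row + c,), diag2 + (row - c,))

    solutions = list(queens())
    print(len(solutions))   # 92 solutions for the 8x8 board
    print(solutions[0])     # the column of the queen in each row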
Category: Data Structures and Algorithms
[321] viXra:2001.0604 [pdf] submitted on 2020-01-28 07:58:57
Authors: Edimar Veríssimo da Silva
Comments: 6 Pages.
This article presents the problem of the symmetric traveling salesman (the distance from city A to city B is the same as the distance from city B to city A) and a non-deterministic algorithm that solves it in some cases using feasible time and computational resources.
Category: Data Structures and Algorithms
[320] viXra:2001.0587 [pdf] submitted on 2020-01-27 15:50:11
Authors: Edimar Veríssimo da Silva
Comments: 5 Pages. Key words: expert systems, lisp, prolog, ops-5, julia, artificial intelligence.
This article presents LISP, PROLOG, OPS-5 and Julia as tools that can facilitate the development of applications with artificial intelligence, especially expert systems.
Category: Data Structures and Algorithms
[319] viXra:2001.0402 [pdf] submitted on 2020-01-19 19:33:35
Authors: Roman Bahadursingh
Comments: 3 Pages.
A password P can be defined as a hash of x symbols. A brute force password cracking algorithm will go through every possible combination of symbols of length 1 to x. This form of password cracking takes O(n) time, where n is the number of possible combinations, given by s^x, where s is the number of symbols available for a password. With multiple processors, having the processors take a decentralized approach, instead of all checking from the first combination to the last, can greatly improve the speed of this computation: to O(n/2) for two processors, O(n/3) for three processors, and O(n/p) for p processors in general. This algorithm also allows multiple processors of different clock speeds to crack a password in more nearly optimal time.
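One hedged way to realise the partitioning (an illustrative strided scheme, not necessarily the note's exact one): candidates are enumerated by integer index, and worker w of P checks indices w, w+P, w+2P, ..., so no two workers repeat work.

    import hashlib

    SYMBOLS = "abc"  # tiny alphabet purely for illustration

    def candidate(i: int, length: int) -> str:
        s = []
        for _ in range(length):         # decode i as base-len(SYMBOLS) digits
            i, r = divmod(i, len(SYMBOLS))
            s.append(SYMBOLS[r])
        return "".join(s)

    def crack(target_hash: str, length: int, worker: int, num_workers: int):
        for i in range(worker, len(SYMBOLS) ** length, num_workers):
            cand = candidate(i, length)
            if hashlib.sha256(cand.encode()).hexdigest() == target_hash:
                return cand
        return None

    target = hashlib.sha256(b"cab").hexdigest()
    print(crack(target, 3, worker=0, num_workers=2) or crack(target, 3, 1, 2))  # cab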
Category: Data Structures and Algorithms
[318] viXra:2001.0295 [pdf] submitted on 2020-01-15 09:02:31
Authors: Domenico Oricchio
Comments: 1 Page.
I present a method for preserving our scientific and cultural knowledge for future generations.
Category: Data Structures and Algorithms
[147] viXra:2406.0050 [pdf] replaced on 2024-06-27 20:42:09
Authors: Chun-Hu Cui, He-Song Cui
Comments: 32 Pages.
In DeFi (Decentralized Finance) applications, and in dApps (Decentralized Application) generally, it is common to periodically pay interest to users as an incentive, or periodically collect a penalty from them as a deterrent. If we view the penalty as a negative reward, both the interest and penalty problems come down to the problem of distributing rewards. Reward distribution is quite accomplishable in financial management where general computers are used, but on a blockchain, where computational resources are inherently expensive and the amount of computation per transaction is absolutely limited with a predefined, uniform quota, not only do the system administrators have to pay heavy gas fees if they handle rewards of many users one by one, but the transaction may also be terminated on the way. The computational quota makes it impossible to guarantee processing an unknown number of users. We propose novel algorithms that solve Simple Interest, Simple Burn, Compound Interest, and Compound Burn tasks, which are typical components of DeFi applications. If we put numerical errors aside, these algorithms realize accurate distribution of rewards to an unknown number of users with no approximation, while adhering to the computational quota per transaction. For those who might already be using similar algorithms, we prove the algorithms rigorously so that they can be transparently presented to users. We also introduce reusable concepts and notations in decentralized reasoning, and demonstrate how they can be efficiently used. We demonstrate, through simulated tests spanning over 128 simulated years, that the numerical errors do not grow to a dangerous level.
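For context, a hedged sketch of one standard pattern that fits this problem shape (O(1) work per transaction regardless of the number of users): a cumulative reward-per-share accumulator. It is not necessarily the authors' algorithm, and it uses floats where a contract would use fixed-point arithmetic.

    class SimpleInterestPool:
        def __init__(self):
            self.acc = 0.0      # cumulative reward per staked unit
            self.stake = {}     # user -> staked amount
            self.entry = {}     # user -> acc snapshot at last update
            self.total = 0.0

        def distribute(self, reward):        # O(1) for any number of users
            if self.total:
                self.acc += reward / self.total

        def deposit(self, user, amount):     # settle pending, then update snapshot
            owed = self.pending(user)
            self.stake[user] = self.stake.get(user, 0.0) + amount
            self.entry[user] = self.acc
            self.total += amount
            return owed

        def pending(self, user):
            return self.stake.get(user, 0.0) * (self.acc - self.entry.get(user, 0.0))

    pool = SimpleInterestPool()
    pool.deposit("alice", 100); pool.deposit("bob", 300)
    pool.distribute(40)
    print(pool.pending("alice"), pool.pending("bob"))  # 10.0 30.0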
Category: Data Structures and Algorithms
[146] viXra:2406.0050 [pdf] replaced on 2024-06-15 21:55:57
Authors: Chun-Hu Cui, He-Song Cui
Comments: 32 Pages.
In DeFi (Decentralized Finance) applications, and in dApps (Decentralized Application) generally, it is common to periodically pay interest to users as an incentive, or periodically collect a penalty from them as a deterrent. If we view the penalty as a negative reward, both the interest and penalty problems come down to the problem of distributing rewards. Reward distribution is quite accomplishable in financial management where general computers are used, but on a blockchain, where computational resources are inherently expensive and the amount of computation per transaction is absolutely limited with a predefined, uniform quota, not only do the system administrators have to pay heavy gas fees if they handle rewards of many users one by one, but the transaction may also be terminated on the way. The computational quota makes it impossible to guarantee processing an unknown number of users. We propose novel algorithms that solve Simple Interest, Simple Burn, Compound Interest, and Compound Burn tasks, which are typical components of DeFi applications. If we put numerical errors aside, these algorithms realize accurate distribution of rewards to an unknown number of users with no approximation, while adhering to the computational quota per transaction. For those who might already be using similar algorithms, we prove the algorithms rigorously so that they can be transparently presented to users. We also introduce reusable concepts and notations in decentralized reasoning, and demonstrate how they can be efficiently used. We demonstrate, through simulated tests spanning over 128 simulated years, that the numerical errors do not grow to a dangerous level.
Category: Data Structures and Algorithms
[145] viXra:2404.0074 [pdf] replaced on 2024-12-06 21:55:14
Authors: Yuly Shipilevsky
Comments: 5 Pages.
We transform an NP-complete problem into a polynomial-time algorithm, which would mean that P = NP.
Category: Data Structures and Algorithms
[144] viXra:2312.0019 [pdf] replaced on 2023-12-07 21:30:29
Authors: Hua Li, Lu Zhang, Ruoxi Guo, Zushang Xiao, Rui Guo
Comments: 10 Pages.
This paper introduces a watertight technique to deal with the boundary representation of surface-surface intersection in CAD. Surfaces play an important role in today's geometric design. The mathematical model of non-uniform rational B-spline (NURBS) surfaces is the mainstream and ISO standard. In the situation of surface-surface intersection, things are a little complicated, for some parts of the surfaces may be cut off; so-called trimmed surfaces occur, which have been the central topic of the past decades in the CAD community, in both academia and industry. The main problem is that the parametric domain of a trimmed surface is generally not the standard square or rectangle but rather is typically bounded by curves, based on point inversion of the intersection points and interpolation. The existence of gaps or overlaps at the intersection boundary makes the preprocessing of CAE and other downstream applications hard, and the NURBS can hardly keep a closed form in this case. Commonly, a special data structure for the intersection curves must be attached to support downstream applications, so the data structure of the whole CAD system is not unified and the computation is not efficient. In terms of Bezier surfaces, a special case of NURBS, this paper designs a reparameterization, or normalization, that transforms the trimmed surface into a group of Bezier surface patches on the standard parametric domain [0,1]×[0,1]. The boundary curve of a normalized Bezier surface patch can then be replaced by the intersection curve to realize watertightness along the boundary. In this way, the trimmed surface is eliminated and the "gap" between CAD and CAE is closed.
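As background only (not the paper's reparameterization), the de Casteljau algorithm below is the basic primitive for evaluating and splitting Bezier geometry; splitting at a parameter t yields sub-patches that again live on standard [0,1] domains.

    # Evaluate a Bezier curve at t by repeated linear interpolation of control points.
    def de_casteljau(points, t):
        pts = [tuple(p) for p in points]
        while len(pts) > 1:
            pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
                   for p, q in zip(pts, pts[1:])]
        return pts[0]

    print(de_casteljau([(0, 0), (1, 2), (3, 3), (4, 0)], 0.5))  # (2.0, 1.875)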
Category: Data Structures and Algorithms
[143] viXra:2212.0071 [pdf] replaced on 2023-05-10 05:55:08
Authors: Herman Schoenfeld
Comments: 18 Pages.
A quantum-resistant, many-time signature scheme combining the Winternitz and Merkle signature schemes is proposed. This construction is compatible with the Abstract Merkle Signature (AMS) scheme and is thus an AMS algorithm, called "WAMS".
Category: Data Structures and Algorithms
[142] viXra:2212.0019 [pdf] replaced on 2023-05-10 05:59:01
Authors: Herman Schoenfeld
Comments: 17 Pages.
An abstract post-quantum digital signature scheme is presented that parameterizes a one-time signature scheme (OTS) for "many-time" use. This scheme permits a single key-pair to efficiently sign and verify a (great) many messages without security degradation. It achieves this by following the original Merkle-Signature Scheme but without a coupling to a specific OTS. Various improvements include a reduction in signature size, resistance to denial-of-service attacks and smaller keys. This construction comprises a bit-level specification for the Abstract Merkle Signature Scheme (AMS).
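A minimal sketch of the Merkle construction that such schemes build on (illustrative, with a power-of-two number of leaves; not the AMS bit-level specification): many OTS public keys are hashed into one root, and each signature carries the authentication path proving its OTS key belongs to that root.

    import hashlib

    H = lambda *xs: hashlib.sha256(b"".join(xs)).digest()

    def merkle_root(leaves):
        level = [H(x) for x in leaves]
        while len(level) > 1:
            level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
        return level[0]

    def auth_path(leaves, idx):
        level, path = [H(x) for x in leaves], []
        while len(level) > 1:
            path.append(level[idx ^ 1])   # sibling at this level
            level = [H(level[i], level[i + 1]) for i in range(0, len(level), 2)]
            idx //= 2
        return path

    def verify(leaf, idx, path, root):
        node = H(leaf)
        for sib in path:
            node = H(node, sib) if idx % 2 == 0 else H(sib, node)
            idx //= 2
        return node == root

    keys = [b"ots-pk-%d" % i for i in range(8)]   # hypothetical OTS public keys
    root = merkle_root(keys)
    print(verify(keys[5], 5, auth_path(keys, 5), root))  # True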
Category: Data Structures and Algorithms
[141] viXra:2208.0034 [pdf] replaced on 2022-08-27 20:15:11
Authors: Mirzakhmet Syzdykov
Comments: 3 Pages.
We state that if there is an order in the target function for the set of variables, then P != NP according to dynamic programming, which is the optimal way of solving combinatorial problems using its recurrence at each step of the algorithm.
Category: Data Structures and Algorithms
[140] viXra:2208.0034 [pdf] replaced on 2022-08-25 18:38:36
Authors: Mirzakhmet Syzdykov
Comments: 3 Pages.
We state that if there is an order in the target function for the set of variables, then P != NP according to dynamic programming, which is the optimal way of solving combinatorial problems using its recurrence at each step of the algorithm.
Category: Data Structures and Algorithms
[139] viXra:2207.0150 [pdf] replaced on 2024-07-30 06:23:37
Authors: Sanjeev Saxena
Comments: 10 Pages.
This note describes a very simple O(1) query time algorithm for finding level ancestors. This is basically a serial (re)-implementation of the parallel algorithm of Berkman and Vishkin (O. Berkman and U. Vishkin, Finding level-ancestors in trees, JCSS, 48, 214--230, 1994). Although the basic algorithm has preprocessing time of O(n log n), by having additional levels or using table lookup, the preprocessing time can be reduced to almost linear or linear. The table lookup algorithm can be built in O(1) parallel time with n processors and can also be used to simplify the parallel algorithm of Berkman and Vishkin and make it optimal.
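For contrast with the O(1)-query scheme described, here is the simpler textbook alternative (binary lifting): O(n log n) preprocessing and O(log n) queries. This is background, not the note's algorithm.

    import math

    def preprocess(parent):                  # parent[root] == -1
        n = len(parent)
        LOG = max(1, math.ceil(math.log2(n)))
        up = [parent[:]] + [[-1] * n for _ in range(LOG - 1)]
        for k in range(1, LOG):              # up[k][v] = 2^k-th ancestor of v
            for v in range(n):
                mid = up[k - 1][v]
                up[k][v] = -1 if mid == -1 else up[k - 1][mid]
        return up

    def level_ancestor(up, v, d):            # d-th ancestor of v, d < n
        k = 0
        while d and v != -1:
            if d & 1:
                v = up[k][v]
            d >>= 1
            k += 1
        return v

    # A path 0 - 1 - 2 - 3 - 4, where the parent of node i is i-1.
    up = preprocess([-1, 0, 1, 2, 3])
    print(level_ancestor(up, 4, 3))  # 1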
Category: Data Structures and Algorithms
[138] viXra:2207.0150 [pdf] replaced on 2024-04-09 16:21:37
Authors: Sanjeev Saxena
Comments: 9 Pages.
This note describes a very simple O(1) query time algorithm for finding level ancestors. This is basically a serial (re)-implementation of the parallel algorithm of Berkman and Vishkin (O. Berkman and U. Vishkin, Finding level-ancestors in trees, JCSS, 48, 214--230, 1994). Although the basic algorithm has preprocessing time of O(n log n), by having additional levels or using table lookup, the preprocessing time can be reduced to almost linear or linear. The table lookup algorithm can be built in O(1) parallel time with n processors and can also be used to simplify the parallel algorithm of Berkman and Vishkin and make it optimal.
Category: Data Structures and Algorithms
[137] viXra:2206.0100 [pdf] replaced on 2022-06-29 01:41:20
Authors: Jabari Zakiya
Comments: 37 Pages.
This paper explains in detail the math and software comprising the implementation of a fast and efficient Segmented Sieve of Zakiya (SSOZ) to count the number of twin and cousin primes within a 64-bit interval, and to provide the largest pair found. Implementations in six programming languages are provided, with benchmarks run on 8- and 16-thread systems. The paper provides the details needed to code it in any language of choice, using the given coded versions as reference implementations.
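Not the SSOZ itself: the plain-sieve baseline below counts twin (gap 2) and cousin (gap 4) prime pairs below a limit and reports the largest pair, which is useful for validating a faster implementation.

    def sieve(n):
        is_p = bytearray([1]) * (n + 1)
        is_p[:2] = b"\x00\x00"
        for i in range(2, int(n ** 0.5) + 1):
            if is_p[i]:
                is_p[i * i::i] = bytearray(len(is_p[i * i::i]))
        return is_p

    def count_pairs(n, gap):
        is_p = sieve(n)
        pairs = [(p, p + gap) for p in range(2, n - gap + 1)
                 if is_p[p] and is_p[p + gap]]
        return len(pairs), pairs[-1]

    for gap in (2, 4):
        print(gap, *count_pairs(1_000_000, gap))  # e.g. 8169 twin pairs below one million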
Category: Data Structures and Algorithms
[136] viXra:2102.0119 [pdf] replaced on 2023-03-19 16:51:25
Authors: Roman Szostek
Comments: 25 Pages. New sports playing system (algorithm) called R-Sport.
The aim of sports competitions is to select the best team, i.e. the champion, from a group of teams or players. Matches must therefore be played between the individual teams, and the results of all matches decide who becomes the champion. The rules of the season form the sports playing system. This document describes a new sports playing system, called R-Sport, which allows the fair and efficient selection of the winner of an entire season. It has advantages that other well-known, currently used sports playing systems do not have, can be used in classic sports as well as in e-sports, and allows league matches to be conducted in many ways.
Category: Data Structures and Algorithms
[135] viXra:2008.0122 [pdf] replaced on 2022-03-22 04:01:19
Authors: James Dow Allen
Comments: 12 Pages.
The theoretical minimum storage needed to represent a set of size N drawn from a universe of size M is about N * (log_2(M/N) + 1.4427) bits, where the constant is log_2(e) (assuming neither N nor M/N is very small). I review the technique of 'quotienting' which is used to approach this minimum, and look at the actual memory costs achieved by practical designs. Instead of somehow implementing and exploiting 1.4427 bits of steering information, most practical schemes use two bits (or more). In the conclusion I mention a scheme to reduce the overhead cost from two bits to a single trit.
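The stated lower bound, written out in code for concreteness (the example N and M are arbitrary):

    import math

    def min_bits(N: int, M: int) -> float:
        # ~ log2(C(M, N)) for small N/M: N * (log2(M/N) + log2(e))
        return N * (math.log2(M / N) + math.log2(math.e))

    def bits_per_element(N: int, M: int) -> float:
        return min_bits(N, M) / N

    print(bits_per_element(1_000_000, 2**32))  # ~13.5 bits per element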
Category: Data Structures and Algorithms
[134] viXra:2007.0194 [pdf] replaced on 2023-05-10 06:02:23
Authors: Herman Schoenfeld
Comments: 5 Pages.
A very simple modification to the standard W-OTS scheme is presented called W-OTS# that achieves a security enhancement similar to W-OTS+ but without the overhead of hashing a randomization vector in every round of the chaining function. The idea proffered by W-OTS# is to simply thwart Birthday-attacks altogether by signing an HMAC of the message-digest (keyed with cryptographically random salt) rather than the message-digest itself. The signer thwarts a birthday attack by virtue of requiring that the attacker guess the salt bits in addition to the message-digest bits during the collision scanning process. By choosing a salt length matching the message-digest length, the security of W-OTS# reduces to that of the cryptographic hash function. This essentially doubles the security level of W-OTS and facilitates the use of shorter hash functions which provides shorter and faster signatures for same security. For example, W-OTS# 128-bit signatures have commensurate security to standard W-OTS 256-bit signatures yet are roughly half the size and twice as fast. It is proposed that Blake2b-128 and Winternitz parameter w=4 (i.e. base-16 digits) be adopted as the default parameter set for the W-OTS# scheme.
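The core tweak in isolation, as a hedged sketch (parameter choices follow the abstract; this is not the full W-OTS# signing procedure): the signer signs an HMAC of the message digest keyed with fresh random salt, and publishes the salt with the signature so verifiers can recompute the keyed digest.

    import hashlib, hmac, os

    def keyed_digest(message: bytes, salt: bytes) -> bytes:
        d = hashlib.blake2b(message, digest_size=16).digest()  # Blake2b-128 digest
        return hmac.new(salt, d, hashlib.blake2b).digest()     # salted digest to be W-OTS-signed

    salt = os.urandom(16)            # attacker must now guess salt bits too
    to_sign = keyed_digest(b"transfer 10 coins", salt)
    print(to_sign.hex())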
Category: Data Structures and Algorithms
[133] viXra:2007.0063 [pdf] replaced on 2020-08-07 10:34:35
Authors: Karim Baina, Boualem Benatallah
Comments: 20 Pages.
With the disruption produced by extensive automation of automation due to advanced research in machine learning and auto machine learning, even in programming language translation, the main goal of this paper is to discuss the following question: "Is it still worth the cost to teach compiling in 2020?". Our paper defends the "yes" answer within software engineering majors. The paper also shares best practices from more than 15 years of experience teaching a compiling techniques course, and presents and evaluates this experience through Hortensias, a pedagogical compiling laboratory platform providing a language compiler and a virtual machine. Hortensias is a multilingual pedagogical platform for learning and teaching how to build compiler front- and back-ends. The Hortensias language offers the programmer the possibility to customise the compiler's associativity management, visualise the intermediary representations of compiled code, or customise the optimisation management and the error-message language for international student communities. Hortensias also offers beginner programmers a graphical user interface to program by clicking. The evaluation of Hortensias-based compiling pedagogy was conducted through two surveys involving, on a voluntary basis, engineering students and alumni during one week. It targeted two null hypotheses: the first supposes that compiling teaching is becoming outdated with regard to current curricula evolution, and the second supposes that Hortensias-based compiling pedagogy has no impact on either understanding or implementing compilers and interpreters. During fifteen years of teaching compiler engineering, Hortensias was a wonderful pedagogic experiment both for teaching and for learning, since vulgarising abstract concepts becomes much easier for teachers, lectures follow a gamification-like approach, and students become efficient at delivering versions of their compiler software product at a fast pace.
Category: Data Structures and Algorithms
[132] viXra:2006.0245 [pdf] replaced on 2020-06-27 08:38:41
Authors: Dibyendu Baksi
Comments: 7 Pages.
The COVID-19 crisis is providing a lot of impetus to the search for innovative technological solutions to the major problems of tracking and containing the pandemic. The major cornerstones of testing, isolation, contact tracing and quarantine are well understood and agreed upon at a general level. In this paper, the software architecture for implementing successful automated digital contact tracing applications is elaborated. The goal of contact tracing is to proactively identify the infection chain of the population, including asymptomatic people yet to be tested positive, i.e., to prevent asymptomatic people from unintentionally spreading the disease. The entire ecosystem of contact tracing is explained so that the real challenges of integrating the key healthcare components are appreciated.
Category: Data Structures and Algorithms
[131] viXra:2003.0039 [pdf] replaced on 2020-03-05 16:48:12
Authors: Elie Duthoo
Comments: 13 Pages. Creative Commons 3.0 Attribution-Noncommercial-ShareAlike license
The Voynich Manuscript (VMS) is an illustrated hand-written document carbon-dated in the early 15th century. This paper aims at providing a statistically robust method for translating voynichese, the language used in the VMS. We will first provide a set of statistical properties that can be applied to any tokenizable language with sub-token elements, apply it to Universal Dependencies (UD) dataset plus VMS (V101 transliteration) to see how it compares to the 157 corpora written in 90 different languages from UD. In a second phase we will provide an algorithm to map characters from one language to characters from another language, and we will apply it to the 158 corpora we have in our possession to measure its quality. We managed to attack more than 60% of UD corpora with this method though results for VMS don't appear to be usable.
Category: Data Structures and Algorithms
[130] viXra:2003.0039 [pdf] replaced on 2020-03-03 06:11:32
Authors: Elie Duthoo
Comments: 13 Pages. Creative Commons 3.0 Attribution-Noncommercial-ShareAlike license
The Voynich Manuscript (VMS) is an illustrated hand-written document carbon-dated in the early 15th century. This paper aims at providing a statistically robust method for translating voynichese, the language used in the VMS. We will first provide a set of statistical properties that can be applied to any tokenizable language with sub-token elements, apply it to Universal Dependencies (UD) dataset plus VMS (V101 transliteration) to see how it compares to the 157 corpora written in 90 different languages from UD. In a second phase we will provide an algorithm to map characters from one language to characters from another language, and we will apply it to the 158 corpora we have in our possession to measure its quality. We managed to attack more than 60% of UD corpora with this method though results for VMS don't appear to be usable.
Category: Data Structures and Algorithms
[129] viXra:2002.0183 [pdf] replaced on 2020-03-09 08:48:47
Authors: Jason Reaves
Comments: 15 Pages.
Obfuscation in malware is commonly employed for any number of reasons, but its purpose is ultimately the same: to make the underlying malicious entity go unnoticed. Crypters and packers are both heavily employed to bypass common security measures, so ultimately these are just tools: tools that utilize algorithms to take data and turn it into some other data while being able to reverse the process later. Obviously these reversible algorithms can be chained together into 'layers'. In this paper I explore the idea that it is easier to think of these layers as a math equation which can be solved. This has the potential of turning something that can be overwhelming at first, like writing an unpacker, into a much more manageable problem.
Category: Data Structures and Algorithms