1. |
Nisar F., Rojek J., Nosewicz S., Szczepański J., Kaszyca K.♦, Chmielewski M.♦, Discrete element model for effective electrical conductivity of spark plasma sintered porous materials,
Computational Particle Mechanics, ISSN: 2196-4378, DOI: 10.1007/s40571-024-00773-4, pp.1-11, 2024. Abstract: This paper analyses electrical conduction in partially sintered porous materials using an original resistor network model within a discrete element framework. The model is based on the sintering geometry, in which two particles are connected via a neck. Particle-to-particle conductance in sintered materials depends on the neck size; therefore, accurate evaluation of the neck size is essential to determine the conductance. The neck size was determined using a volume preservation criterion. Additionally, a grain boundary correction factor was introduced to compensate for any non-physical overlaps between particles, particularly at higher densification. Furthermore, a grain boundary resistance was added to account for the porosity within necks. For the numerical analysis, the DEM sample was generated using a real particle size distribution, ensuring a heterogeneous and realistic microstructure characterized by a maximum-to-minimum particle diameter ratio of 15. The DEM sample was subjected to a hot-press simulation to obtain geometries with different porosity levels. These representative geometries were used to simulate current flow and determine the effective electrical conductivity as a function of porosity. The discrete element model (DEM) was validated against experimentally measured electrical conductivities of porous NiAl samples manufactured by spark plasma sintering (SPS). The numerical results were in close agreement with the experimental ones, confirming the accuracy of the model. The model can be used for microscopic analysis and can also be coupled with sintering models to evaluate effective properties during the sintering process. Keywords: Discrete element method, Effective electrical conductivity, Porous materials, Sintering, Resistor network model Affiliations:
Nisar F. | - | IPPT PAN | Rojek J. | - | IPPT PAN | Nosewicz S. | - | IPPT PAN | Szczepański J. | - | IPPT PAN | Kaszyca K. | - | Lukasiewicz Institute of Microelectronics and Photonics (PL) | Chmielewski M. | - | Institute of Electronic Materials Technology (PL) |
| |
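A minimal numerical sketch of the resistor-network idea summarized in entry 1: each particle contact is assigned a conductance that grows with its neck radius, and the response follows from solving Kirchhoff's equations on the contact graph. The classical Maxwell constriction formula G = 2*sigma*a is used here as a stand-in for the paper's neck-based conductance (the grain-boundary correction and neck-porosity terms are omitted), and the chain geometry, conductivity and neck radii are illustrative values, not data from the paper.

```python
import numpy as np

# Toy resistor network: a chain of 4 particles between two electrodes.
# Contact conductance ~ Maxwell constriction formula G = 2*sigma*a (a = neck radius);
# the paper's grain-boundary correction and neck-porosity terms are not modelled here.
sigma = 8.0e6                           # bulk conductivity, S/m (illustrative)
neck_radii = [2.0e-6, 1.5e-6, 2.5e-6]   # neck radii of the three contacts, m (illustrative)

n = 4                                    # particle nodes
G = np.zeros((n, n))                     # conductance (weighted adjacency) matrix
for i, a in enumerate(neck_radii):
    G[i, i + 1] = G[i + 1, i] = 2.0 * sigma * a

L = np.diag(G.sum(axis=1)) - G           # Kirchhoff (graph Laplacian) matrix

# Fix potentials at the end particles (1 V and 0 V) and solve for the interior nodes.
V = np.zeros(n)
V[0] = 1.0
inner = np.arange(1, n - 1)
rhs = -L[np.ix_(inner, [0, n - 1])] @ V[[0, n - 1]]
V[inner] = np.linalg.solve(L[np.ix_(inner, inner)], rhs)

I_in = G[0, 1] * (V[0] - V[1])           # current injected at the first electrode
print(f"effective contact-chain conductance: {I_in / (V[0] - V[-1]):.3e} S")
```

For a full sample the same linear solve would be applied to the whole particle-contact network, with the effective conductivity then obtained from the sample geometry.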
2. |
Paprocki B.♦, Pręgowska A., Szczepański J., Does Adding of Neurons to the Network Layer Lead to Increased Transmission Efficiency?,
IEEE Access, ISSN: 2169-3536, DOI: 10.1109/ACCESS.2024.3379324, Vol.12, pp. 42701-42709, 2024. Abstract: The aim of this study is to contribute to the important question in Neuroscience of whether the number of neurons in a given layer of a network affects transmission efficiency. Mutual Information, as defined by Shannon, between the input and output signals for certain classes of networks is analyzed theoretically and numerically. A Levy-Baxter probabilistic neural model is applied. This model includes all important qualitative mechanisms involved in the transmission process in the brain. We derived analytical formulas for the Mutual Information of input signals coming from Information Sources modelled as Bernoulli processes. These formulas depend on the parameters of the Information Source, the neurons and the network. Numerical simulations were performed using these equations. It turned out that, beyond a certain value, the Mutual Information increases very slowly as neurons are added; the increase is of the order of m^{−c}, where m is the number of neurons in the transmission layer and c is very small. The calculations also show that for a practical number (up to 15000) of neurons, the Mutual Information reaches only approximately half of the information carried by the input signal. The influence of noise on the transmission efficiency as a function of the number of neurons was also analyzed. It turned out that the noise level at which transmission is optimal increases significantly with this number. Our results indicate that a large number of neurons in the network does not imply an essential improvement in transmission efficiency, but can contribute to reliability. Keywords: Shannon communication theory, neural network, network layer, transmission efficiency, mutual information, model of neuron, spike trains, information source, entropy Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Pręgowska A. | - | IPPT PAN | Szczepański J. | - | IPPT PAN |
| |
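A rough Monte-Carlo companion to entry 2, showing only the qualitative effect discussed there (mutual information saturating as units are added to a layer). The layer model below is a drastic simplification of the Levy-Baxter neuron used in the paper — each unit merely receives the common Bernoulli input spike through an unreliable synapse of success probability s, with no amplitude fluctuations or threshold dynamics — so the numbers are not the paper's, and the plug-in estimator is an assumption of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(joint):
    """Plug-in mutual information (bits) from a joint count table."""
    joint = joint / joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def simulate_layer(m, p=0.5, s=0.3, trials=200_000):
    """Bernoulli(p) input bit fanned out to m units; each synapse succeeds w.p. s.
    The layer output is summarized by the number of units that received the spike."""
    x = (rng.random(trials) < p).astype(int)          # input spikes
    successes = rng.binomial(m, s, size=trials) * x   # spike count in the layer
    joint = np.zeros((2, m + 1))
    np.add.at(joint, (x, successes), 1.0)
    return mutual_information(joint)

for m in (1, 5, 20, 100):
    print(f"m = {m:3d} units: I(input; layer output) ~ {simulate_layer(m):.3f} bits")
```

Even in this crude setting the estimate grows quickly at first and then saturates well below the input entropy gains one might naively expect from adding units, which is the qualitative point of the entry.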
3. |
Rudnicka Z., Szczepański J., Pręgowska A., Artificial Intelligence-Based Algorithms in Medical Image Scan Segmentation and Intelligent Visual Content Generation—A Concise Overview,
Electronics, ISSN: 2079-9292, DOI: 10.3390/electronics13040746, Vol.13, No.4, pp.1-35, 2024. Abstract: Recently, artificial intelligence (AI)-based algorithms have revolutionized medical image segmentation processes. Thus, the precise segmentation of organs and their lesions may contribute to an efficient diagnostics process and a more effective selection of targeted therapies, as well as increasing the effectiveness of the training process. In this context, AI may contribute to the automation of the image scan segmentation process and increase the quality of the resulting 3D objects, which may lead to the generation of more realistic virtual objects. In this paper, we focus on the AI-based solutions applied in medical image scan segmentation and intelligent visual content generation, i.e., computer-generated three-dimensional (3D) images in the context of extended reality (XR). We consider different types of neural networks used, with a special emphasis on the learning rules applied, taking into account algorithm accuracy and performance, as well as open data availability. This paper attempts to summarize the current development of AI-based segmentation methods in medical imaging and intelligent visual content generation that are applied in XR. It concludes with possible developments and open challenges in AI applications in extended reality-based solutions. Finally, future lines of research and development directions of artificial intelligence applications, both in medical image segmentation and extended reality-based medical solutions, are discussed. Keywords: artificial intelligence, extended reality, medical image scan segmentation Affiliations:
Rudnicka Z. | - | IPPT PAN | Szczepański J. | - | IPPT PAN | Pręgowska A. | - | IPPT PAN |
| |
4. |
Garlinska M.♦, Pręgowska A., Gutowska I.♦, Osial M.♦, Szczepański J., Experimental study of the free space optics communication system operating in the 8–12 μm spectral range,
Electronics, ISSN: 2079-9292, DOI: 10.3390/electronics10080875, Vol.10, No.8, pp.875-1-13, 2021. Abstract: (1) Background: Free space optics communication (FSO) has improved wireless communication and data transfer thanks to high bandwidth, low power consumption, energy efficiency, a high transfer capacity, and a wide applicability field. FSO systems also have their limitations, including weather conditions and obstacles in the way of transmission. (2) Methods: This research assesses the influence of atmospheric conditions on the intensity of received radiation, both experimentally and theoretically. The construction of a laboratory test stand of an FSO system operating in the third atmospheric transmission window (8–12 μm) is proposed. Next, considering different atmospheric conditions, experimental validation was conducted, both in laboratory and real conditions. (3) Results: The measurements were carried out for two optical links working at wavelengths of 1.5 μm and 10 μm. It was found that optical radiation with a wavelength of about 10 μm is characterized by better transmission properties in the case of limited visibility (e.g., light rain and fog) than near-infrared waves. The same conclusion was reached in the analytical investigations. (4) Conclusions: The results obtained show that optical radiation with a wavelength of about 10 μm in limited visibility is characterized by better transmission properties than near-infrared waves. This demonstrates the validity of designing FSO links operating in the 8–12 μm band, e.g., based on quantum cascade lasers and HgCdTe photodiodes. Keywords: free space optical communication, IR photodetector, quantum cascade laser, wireless communication Affiliations:
Garlinska M. | - | National Center for Research and Development (PL) | Pręgowska A. | - | IPPT PAN | Gutowska I. | - | Oregon State University (US) | Osial M. | - | other affiliation | Szczepański J. | - | IPPT PAN |
| |
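As background to the wavelength comparison in entry 4, a commonly used first-order estimate of haze/fog attenuation versus wavelength is the empirical Kruse visibility model; the sketch below uses that model with illustrative visibility values and is not the analytical model fitted in the paper.

```python
def kruse_attenuation_db_per_km(wavelength_nm, visibility_km):
    """Specific atmospheric attenuation (dB/km) from the empirical Kruse model."""
    if visibility_km > 50:
        q = 1.6
    elif visibility_km > 6:
        q = 1.3
    else:
        q = 0.585 * visibility_km ** (1.0 / 3.0)   # Kruse low-visibility exponent
    return (3.91 / visibility_km) * (wavelength_nm / 550.0) ** (-q)

for vis in (10.0, 4.0, 1.0):                        # clear air, haze, light fog (km)
    a_nir = kruse_attenuation_db_per_km(1550, vis)      # 1.5 um near-IR link
    a_lwir = kruse_attenuation_db_per_km(10_000, vis)   # 10 um (LWIR) link
    print(f"V = {vis:4.1f} km:  1.5 um -> {a_nir:5.2f} dB/km,  10 um -> {a_lwir:5.2f} dB/km")
```

Within this simple model the longer wavelength always sees the smaller specific attenuation, consistent with the trend reported in the entry; dense fog, where both wavelengths attenuate similarly, is outside its range of validity.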
5. |
Szczepański J., Klamka J.♦, Węgrzyn-Wolska K.♦, Rojek I.♦, Prokopowicz P.♦, Computational intelligence and optimization techniques in communications and control,
BULLETIN OF THE POLISH ACADEMY OF SCIENCES: TECHNICAL SCIENCES, ISSN: 0239-7528, DOI: 10.24425/bpasts.2020.131851, Vol.68, No.2, pp.181-184, 2020, EDITORIAL. Abstract: The objective of this Special Section on Computational Intelligence and Optimization Techniques in Communications and Control is to present recent advances and applications of Ordered Fuzzy Numbers (OFNs), computational intelligence (CI) and other optimization methods for communication security, assessing the effectiveness of routing techniques in wireless networks, automatic sound recognition, the analysis of brain signals, routing and switching in optical networks, and process control. Affiliations:
Szczepański J. | - | IPPT PAN | Klamka J. | - | Institute of Theoretical and Applied Informatics, Polish Academy of Sciences (PL) | Węgrzyn-Wolska K. | - | École d'Ingénieurs Généraliste du Numérique (FR) | Rojek I. | - | Kazimierz Wielki University (PL) | Prokopowicz P. | - | Kazimierz Wielki University (PL) |
| |
6. |
Paprocki B.♦, Pręgowska A., Szczepański J., Optimizing information processing in brain-inspired neural networks,
BULLETIN OF THE POLISH ACADEMY OF SCIENCES: TECHNICAL SCIENCES, ISSN: 0239-7528, DOI: 10.24425/bpasts.2020.131844, Vol.68, No.2, pp.225-233, 2020. Abstract: The way brain networks maintain high transmission efficiency is believed to be fundamental in understanding brain activity. Brains consisting of more cells render information transmission more reliable and robust to noise. On the other hand, processing information in larger networks requires additional energy. Recent studies suggest that it is complexity, connectivity, and function diversity, rather than just size and the number of neurons, that could favour the evolution of memory, learning, and higher cognition. In this paper, we use Shannon information theory to address transmission efficiency quantitatively. We describe neural networks as communication channels, and then we measure information as mutual information between stimuli and network responses. We employ a probabilistic neuron model based on the approach proposed by Levy and Baxter, which comprises essential qualitative information transfer mechanisms. In this paper, we overview and discuss our previous quantitative results regarding brain-inspired networks, addressing their qualitative consequences in the context of broader literature. It is shown that mutual information is often maximized in a very noisy environment, e.g., where only one-third of all input spikes are allowed to pass through noisy synapses and farther into the network. Moreover, we show that inhibitory connections as well as properly displaced long-range connections often significantly improve transmission efficiency. A deep understanding of brain processes in terms of advanced mathematical science plays an important role in the explanation of the nature of brain efficiency. Our results confirm that basic brain components that appear during the evolution process arise to optimise transmission performance. Keywords: neural network, entropy, mutual information, noise, inhibitory neuron Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Pręgowska A. | - | IPPT PAN | Szczepański J. | - | IPPT PAN |
| |
7. |
Pręgowska A., Kaplan E.♦, Szczepański J., How far can neural correlations reduce uncertainty? Comparison of information transmission rates for Markov and Bernoulli processes,
International Journal of Neural Systems, ISSN: 0129-0657, DOI: 10.1142/S0129065719500035, Vol.29, No.8, pp.1950003-1-13, 2019. Abstract: The nature of neural codes is central to neuroscience. Do neurons encode information through relatively slow changes in the firing rates of individual spikes (rate code) or by the precise timing of every spike (temporal code)? Here we compare the loss of information due to correlations for these two possible neural codes. The essence of Shannon's definition of information is to combine information with uncertainty: the higher the uncertainty of a given event, the more information is conveyed by that event. Correlations can reduce uncertainty or the amount of information, but by how much? In this paper we address this question by a direct comparison of the information per symbol conveyed by the words coming from a binary Markov source (temporal code) with the information per symbol coming from the corresponding Bernoulli source (uncorrelated, rate code). In a previous paper we found that a crucial role in the relation between information transmission rates (ITRs) and firing rates is played by a parameter s, which is the sum of transition probabilities from the no-spike state to the spike state and vice versa. Here we found that the same parameter s plays a crucial role in this case too. We calculated the maximal and minimal bounds of the quotient of ITRs for these sources. Next, making use of the entropy grouping axiom, we determined the loss of information in a Markov source compared with the information in the corresponding Bernoulli source for a given word length. Our results show that in the case of correlated signals the loss of information is relatively small, and thus temporal codes, which are more energetically efficient, can replace rate codes effectively. These results were confirmed by experiments. Keywords: Shannon information theory, information source, information transmission rate, firing rate, neural coding Affiliations:
Pręgowska A. | - | IPPT PAN | Kaplan E. | - | Icahn School of Medicine at Mount Sinai (US) | Szczepański J. | - | IPPT PAN |
| |
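For entry 7, the two quantities being compared have standard closed forms. Assuming the notation suggested by the abstract (transition probabilities p_{01} from the no-spike to the spike state and p_{10} back, with s = p_{01} + p_{10} and firing rate p), the entropy rates per symbol are

\[
p = \frac{p_{01}}{p_{01}+p_{10}} = \frac{p_{01}}{s}, \qquad
H_{\mathrm{Markov}} = (1-p)\,H_2(p_{01}) + p\,H_2(p_{10}), \qquad
H_{\mathrm{Bernoulli}} = H_2(p),
\]

where \(H_2(x) = -x\log_2 x - (1-x)\log_2(1-x)\). The loss of information due to correlations discussed in the abstract is the gap \(H_{\mathrm{Bernoulli}} - H_{\mathrm{Markov}} \ge 0\), since among binary sources with a given firing rate the memoryless (Bernoulli) one maximizes the entropy rate.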
8. |
Błoński S., Pręgowska A., Michałek T., Szczepański J., The use of Lempel-Ziv complexity to analyze turbulence and flow randomness based on velocity fluctuations,
BULLETIN OF THE POLISH ACADEMY OF SCIENCES: TECHNICAL SCIENCES, ISSN: 0239-7528, DOI: 10.24425/bpasts.2019.130876, Vol.67, No.5, pp.957-962, 2019. Abstract: One of the mathematical tools to measure the generation rate of new patterns along a sequence of symbols is the Lempel-Ziv complexity (LZ). Under additional assumptions, LZ is an estimator of entropy in the Shannon sense. Since entropy is considered a measure of randomness, this means that LZ can also be treated as a randomness indicator. In this paper, we applied the LZ concept to the analysis of different flow regimes in cold-flow combustor models. Experimental data for two combustor configurations, motivated by the need for efficient mixing, were considered. Extensive computer analysis was applied to develop a complexity approach to the analysis of velocity fluctuations recorded with hot-wire anemometry and the PIV technique. A natural encoding method to address these velocity fluctuations was proposed. It turned out that with this encoding the complexity values of the sequences are well correlated with the values obtained by means of the RMS method (larger/smaller complexity corresponds to larger/smaller RMS). However, our calculations revealed the interesting result that the most complex, i.e. most random, behavior does not coincide with the "most turbulent" point determined by the RMS method, but is located at the point with maximal average velocity. It seems that the complexity method can be particularly useful to analyze turbulent and unsteady flow regimes. Moreover, the complexity can also be used to establish other flow characteristics such as ergodicity or mixing. Keywords: turbulence, complexity, entropy, randomness Affiliations:
Błoński S. | - | IPPT PAN | Pręgowska A. | - | IPPT PAN | Michałek T. | - | IPPT PAN | Szczepański J. | - | IPPT PAN |
| |
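Entries 8-9 (and several of the older entries below) rely on Lempel-Ziv (1976) complexity as an entropy/randomness indicator. A compact reference implementation of the phrase-counting scheme and the usual normalization is sketched below; the binarization of a surrogate signal around its running mean is my own stand-in for the encodings used in the papers.

```python
import numpy as np

def lz76(seq):
    """Lempel-Ziv (1976) complexity: number of phrases in the exhaustive parsing
    of the symbol sequence seq (Kaspar-Schuster counting scheme)."""
    n = len(seq)
    if n == 0:
        return 0
    c, l, i, k, kmax = 1, 1, 0, 1, 1
    while True:
        if seq[i + k - 1] == seq[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            kmax = max(k, kmax)
            i += 1
            if i == l:                 # no earlier start reproduces the phrase
                c += 1
                l += kmax
                if l + 1 > n:
                    break
                i, k, kmax = 0, 1, 1
            else:
                k = 1
    return c

def normalized_lz(bits):
    """c(n) * log2(n) / n: ~1 for i.i.d. fair-coin bits, ~0 for highly regular ones."""
    n = len(bits)
    return lz76(bits) * np.log2(n) / n

rng = np.random.default_rng(1)
signal = rng.normal(size=4096)                      # stand-in for velocity fluctuations
bits = (signal > np.convolve(signal, np.ones(32) / 32, mode="same")).astype(int)
print("normalized LZ of encoded signal :", round(normalized_lz(list(bits)), 3))
print("normalized LZ of constant signal:", round(normalized_lz([0] * 4096), 3))
```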
9. |
Pręgowska A., Proniewska K.♦, van Dam P.♦, Szczepański J., Using Lempel-Ziv complexity as effective classification tool of the sleep-related breathing disorders,
Computer Methods and Programs in Biomedicine, ISSN: 0169-2607, DOI: 10.1016/j.cmpb.2019.105052, Vol.182, pp.105052-1-7, 2019. Abstract: Background and objective: People suffer from sleep disorders caused by work-related stress, irregular lifestyle or mental health problems. Therefore, the development of effective tools to diagnose sleep disorders is important. Recently, Information Theory has been exploited to analyze biomedical signals. We propose an efficient classification method for sleep anomalies by applying entropy-estimating algorithms to encoded ECG signals coming from patients suffering from Sleep-Related Breathing Disorders (SRBD). Methods: First, ECGs were discretized using an encoding method which captures the biosignal variability. It takes into account oscillations of ECG measurements around signal averages. Next, to estimate the entropy of the encoded signals, the Lempel–Ziv complexity algorithm (LZ), which measures the pattern generation rate, was applied. Then, optimal encoding parameters, which allow distinguishing normal versus abnormal events during sleep with high sensitivity and specificity, were determined numerically. Simultaneously, subjects' states were identified using the acoustic signal of breathing recorded in the same period during sleep. Results: Random sequences show normalized LZ close to 1, while for more regular sequences it is closer to 0. Our calculations show that SRBDs have normalized LZ around 0.32 (on average), while the control group has complexity around 0.85. The results obtained for a public database are similar, i.e. LZ around 0.48 for SRBDs and around 0.7 for the control group. These show that signals within the control group are more random, whereas for the SRBD group ECGs are more deterministic. This finding remained valid both for signals acquired during the whole duration of the experiment and when shorter time intervals were considered. The proposed classifier provided sleep disorder diagnostics with a sensitivity of 93.75% and a specificity of 73.00%. To validate our method, we also considered different variants of the training and testing sets. In all cases, the optimal encoding parameter, sensitivity and specificity values were similar to our results above. Conclusions: Our pilot study suggests that an LZ-based algorithm could be used as a clinical tool to classify sleep disorders, since the LZ complexities for SRBD positives versus healthy individuals show a significant difference. Moreover, normalized LZ complexity changes are related to the snoring level. This study also indicates that the LZ technique is able to detect sleep abnormalities at an early disorder stage. Keywords: information theory, Lempel-Ziv complexity, entropy, ECG, sleep-related breathing disorders, randomness Affiliations:
Pręgowska A. | - | IPPT PAN | Proniewska K. | - | Jagiellonian University (PL) | van Dam P. | - | PEACS BV, Nieuwerbrug (NL) | Szczepański J. | - | IPPT PAN |
| |
10. |
Pręgowska A., Casti A.♦, Kaplan E.♦, Wajnryb E., Szczepański J., Information processing in the LGN: a comparison of neural codes and cell types,
BIOLOGICAL CYBERNETICS, ISSN: 0340-1200, DOI: 10.1007/s00422-019-00801-0, Vol.113, No.4, pp.453-464, 2019. Abstract: To understand how anatomy and physiology allow an organism to perform its function, it is important to know how information that is transmitted by spikes in the brain is received and encoded. A natural question is whether the spike rate alone encodes the information about a stimulus (rate code), or additional information is contained in the temporal pattern of the spikes (temporal code). Here we address this question using data from the cat Lateral Geniculate Nucleus (LGN), which is the visual portion of the thalamus, through which visual information from the retina is communicated to the visual cortex. We analyzed the responses of LGN neurons to spatially homogeneous spots of various sizes with temporally random luminance modulation. We compared the Firing Rate with the Shannon Information Transmission Rate, which quantifies the information contained in the temporal relationships between spikes. We found that the behavior of these two rates can differ quantitatively. This suggests that the energy used for spiking does not translate directly into the information to be transmitted. We also compared Firing Rates with Information Rates for X-ON and X-OFF cells. We found that, for X-ON cells the Firing Rate and Information Rate often behave in a completely different way, while for X-OFF cells these rates are much more highly correlated. Our results suggest that for X-ON cells a more efficient "temporal code" is employed, while for X-OFF cells a straightforward "rate code" is used, which is more reliable and is correlated with energy consumption. Keywords: Shannon information theory, cat LGN, ON–OFF cells, neural coding, entropy, firing rate Affiliations:
Pręgowska A. | - | IPPT PAN | Casti A. | - | Fairleigh Dickinson University (US) | Kaplan E. | - | Icahn School of Medicine at Mount Sinai (US) | Wajnryb E. | - | IPPT PAN | Szczepański J. | - | IPPT PAN |
| |
11. |
Pręgowska A., Szczepański J., Wajnryb E., Temporal code versus rate code for binary Information Sources,
NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2016.08.034, Vol.216, pp.756-762, 2016. Abstract: Neuroscientists formulate very different hypotheses about the nature of neural coding. At one extreme, it has been argued that neurons encode information through relatively slow changes in the arrival rates of individual spikes (rate codes) and that the irregularity in the spike trains reflects the noise in the system. At the other extreme, this irregularity is the code itself (temporal codes) so that the precise timing of every spike carries additional information about the input. It is well known that in the estimation of Shannon Information Transmission Rate, the patterns and temporal structures are taken into account, while the “rate code” is already determined by the firing rate, i.e. by the spike frequency. In this paper we compare these two types of codes for binary Information Sources, which model encoded spike trains. Assuming that the information transmitted by a neuron is governed by an uncorrelated stochastic process or by a process with a memory, we compare the Information Transmission Rates carried by such spike trains with their firing rates. Here we show that a crucial role in the relation between information transmission and firing rates is played by a factor that we call the “jumping” parameter. This parameter corresponds to the probability of transitions from the no-spike-state to the spike-state and vice versa. For low jumping parameter values, the quotient of information and firing rates is a monotonically decreasing function of the firing rate, and there is therefore a straightforward, one-to-one relation between temporal and rate codes. However, it turns out that for large enough values of the jumping parameter this quotient is a non-monotonic function of the firing rate and it exhibits a global maximum, so that in this case there is an optimal firing rate. Moreover, there is no one-to-one relation between information and firing rates, so the temporal and rate codes differ qualitatively. This leads to the observation that the behavior of the quotient of information and firing rates for a large jumping parameter value is especially important in the context of bursting phenomena. Keywords: Information Theory, Information Source, Stochastic process, Information transmission rate, Firing rate Affiliations:
Pręgowska A. | - | IPPT PAN | Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN |
| |
12. |
Pręgowska A., Szczepański J., Wajnryb E., Mutual information against correlations in binary communication channels,
BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/s12868-015-0168-0, Vol.16, No.32, pp.1-7, 2015. Abstract: Background
Explaining how brain processing can be so fast remains an open problem (van Hemmen JL, Sejnowski T., 2004). Thus, the analysis of neural transmission processes (Shannon CE, Weaver W., 1963) basically focuses on searching for effective encoding and decoding schemes. According to Shannon's fundamental theorem, mutual information plays a crucial role in characterizing the efficiency of communication channels. It is well known that this efficiency is determined by the channel capacity, which is precisely the maximal mutual information between input and output signals. On the other hand, intuitively speaking, when input and output signals are more correlated, the transmission should be more efficient. A natural question arises about the relation between mutual information and correlation. We analyze the relation between these quantities using the binary representation of signals, which is the most common approach taken in studying neuronal processes of the brain.
Results
We present binary communication channels for which mutual information and correlation coefficients behave differently both quantitatively and qualitatively. Despite this difference in behavior, we show that the noncorrelation of binary signals implies their independence, in contrast to the case for general types of signals.
Conclusions
Our research shows that mutual information cannot be replaced by sheer correlations. Our results indicate that neuronal encoding has a more complicated nature, which cannot be captured by straightforward correlations between input and output signals, since the mutual information takes into account the structure and patterns of the signals. Keywords: Shannon information, Communication channel, Entropy, Mutual information, Correlation, Neuronal encoding Affiliations:
Pręgowska A. | - | IPPT PAN | Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN |
| |
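The statement in entry 12 that uncorrelated binary signals are automatically independent has a short verification. Writing \(p_{xy} = \Pr(X = x, Y = y)\) for the binary input X and output Y,

\[
\operatorname{Cov}(X,Y) = \mathbb{E}[XY] - \mathbb{E}[X]\,\mathbb{E}[Y]
= p_{11} - (p_{10}+p_{11})(p_{01}+p_{11}).
\]

If the covariance vanishes, then \(p_{11} = \Pr(X{=}1)\Pr(Y{=}1)\), and the remaining cells factor as well, e.g. \(p_{10} = \Pr(X{=}1) - p_{11} = \Pr(X{=}1)\,(1-\Pr(Y{=}1)) = \Pr(X{=}1)\Pr(Y{=}0)\); hence X and Y are independent and \(I(X;Y) = 0\). For non-binary signals zero correlation does not force independence, which is the contrast the abstract points to.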
13. |
Arnold M.M.♦, Szczepański J., Montejo N.♦, Amigó J.M.♦, Wajnryb E., Sanchez-Vives M.V.♦, Information content in cortical spike trains during brain state transitions,
JOURNAL OF SLEEP RESEARCH, ISSN: 0962-1105, DOI: 10.1111/j.1365-2869.2012.01031.x, Vol.22, pp.13-21, 2013. Abstract: Even in the absence of external stimuli there is ongoing activity in the cerebral cortex as a result of recurrent connectivity. This paper attempts to characterize one aspect of this ongoing activity by examining how the information content carried by specific neurons varies as a function of brain state. We recorded from rats chronically implanted with tetrodes in the primary visual cortex during awake and sleep periods. Electroencephalogram and spike trains were recorded during 30-min periods, and 2–4 neuronal spikes were isolated per tetrode off-line. All the activity included in the analysis was spontaneous, being recorded from the visual cortex in the absence of visual stimuli. The brain state was determined through a combination of behavior evaluation, electroencephalogram and electromyogram analysis. Information in the spike trains was determined by using Lempel–Ziv Complexity. Complexity was used to estimate the entropy of neural discharges and thus the information content (Amigó et al., Neural Comput., 2004, 16: 717–736). The information content in spike trains (range 4–70 bits s⁻¹) was evaluated during different brain states and particularly during the transition periods. Transitions toward states of deeper sleep coincided with a decrease of information, while transitions to the awake state resulted in an increase in information. Changes in both directions were of the same magnitude, about 30%. Information in spike trains showed a high temporal correlation between neurons, reinforcing the idea of the impact of the brain state on the information content of spike trains. Keywords: awake, brain states, entropy, firing rate, information, sleep, spike train Affiliations:
Arnold M.M. | - | Universidad Miguel Hernández-CSIC (ES) | Szczepański J. | - | IPPT PAN | Montejo N. | - | Universidad Miguel Hernández-CSIC (ES) | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Wajnryb E. | - | IPPT PAN | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
14. |
Paprocki B.♦, Szczepański J., How do the amplitude fluctuations affect the neuronal transmission efficiency,
NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2012.11.001, Vol.104, pp.50-56, 2013. Abstract: Tremendous effort has been devoted to understanding the nature of neuronal coding, its high efficiency and the mechanisms governing it. Paprocki and Szczepanski [13] explored the model of neuron proposed by Levy and Baxter [12] and analyzed the efficiency with respect to the synaptic failure, activation threshold, firing rate and type of the input source. In this paper we study the influence of amplitude fluctuations (damping, uniform and amplifying), another important component in neuronal computations. The efficiency is understood in the sense of mutual information (between input and output signals), which constitutes the fundamental concept of the Shannon communication theory. Using high quality entropy estimators we determined maximal values of the mutual information between input and output neuronal signals for a simple neuronal architecture. We observed that this maximal efficiency remains nearly constant, almost regardless of the fluctuation type. We have also found that for a wide range of thresholds, both for damping and amplifying fluctuations, the mutual information behaves in an opposite way to the corresponding correlations between input and output signals. These calculations confirm that the neuronal coding is much more subtle than the straightforward intuitive optimization of input–output correlations. Keywords: Neural computation, Mutual information, Amplitude fluctuation, Activation threshold, Synaptic failure, Entropy estimation Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Szczepański J. | - | IPPT PAN |
| |
15. |
Paprocki B.♦, Szczepański J., Transmission efficiency in ring, brain inspired neuronal networks. Information and energetic aspects,
Brain Research, ISSN: 0006-8993, DOI: 10.1016/j.brainres.2013.07.024, Vol.1536, pp.135-143, 2013. Abstract: Organisms often evolve as compromises, and many of these compromises can be expressed in terms of energy efficiency. Thus, many authors analyze the energetic costs of processes during information transmission in the brain. In this paper we study the information transmission rate per energy used in a class of ring, brain inspired neural networks, which we assume to involve components like excitatory and inhibitory neurons or long-range connections. As the neuron model we followed the probabilistic approach proposed by Levy and Baxter (2002), which contains all essential qualitative mechanisms participating in the transmission process and provides results consistent with physiologically observed values.
Our research shows that all network components, in a broad range of conditions, significantly improve the information-energetic efficiency. It turned out that inhibitory neurons can improve the information-energetic transmission efficiency by 50%, while long-range connections can improve the efficiency even by 70%. We also found that the smallest network is the most effective: we observed that a twofold increase in size can cause even a threefold decrease of the information-energetic efficiency. Keywords: Information transmission efficiency, Mutual information, Brain inspired network, Inhibitory neuron, Long-range connection, Neuronal computation Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Szczepański J. | - | IPPT PAN |
| |
16. |
Paprocki B.♦, Szczepański J., Kołbuk D., Information transmission efficiency in neuronal communication systems,
BMC NEUROSCIENCE, ISSN: 1471-2202, DOI: 10.1186/1471-2202-14-S1-P217, Vol.14(Suppl 1), No.P217, pp.1-2, 2013. Abstract: The nature of brain transmission processes, their high reliability and efficiency, is one of the most elusive areas of contemporary science [1]. We study information transmission efficiency by considering neuronal communication as a Shannon-type channel. Thus, using high quality entropy estimators, we evaluate the mutual information between input and output signals. We assume the model of neuron proposed by Levy and Baxter [2], which incorporates all essential qualitative mechanisms participating in the neural transmission process. Keywords: transmission efficiency, neuronal communication, Shannon-type channel Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Szczepański J. | - | IPPT PAN | Kołbuk D. | - | IPPT PAN |
| |
17. |
Paprocki B.♦, Szczepański J., Efficiency of neural transmission as a function of synaptic noise, threshold, and source characteristics,
BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/j.biosystems.2011.03.005, Vol.105, pp.62-72, 2011. Abstract: There has been a growing interest in the estimation of information carried by a single neuron and by multiple single units or populations of neurons in response to specific stimuli. In this paper we analyze, inspired by the article of Levy and Baxter (2002), the efficiency of neuronal communication by considering dendrosomatic summation as a Shannon-type channel (1948) and such uncertain synaptic transmission as part of the dendrosomatic computation. Specifically, we study Mutual Information between input and output signals for different types of neuronal network architectures by applying efficient entropy estimators. We analyze the influence of the following quantities affecting the transmission abilities of neurons: synaptic failure, activation threshold, firing rate and type of the input source. We observed a number of surprising non-intuitive effects. It turns out that, especially for lower activation thresholds, significant synaptic noise can lead even to a twofold increase of the transmission efficiency. Moreover, the efficiency turns out to be a non-monotonic function of the activation threshold. We find a universal value of the threshold for which a local maximum of Mutual Information is achieved for most of the neuronal architectures, regardless of the type of the source (correlated or non-correlated). Additionally, to reach the global maximum, the optimal firing rates must increase with the threshold. This effect is particularly visible for lower firing rates. For higher firing rates the influence of synaptic noise on the transmission efficiency is more advantageous. Noise is an inherent component of communication in biological systems, hence, based on our analysis, we conjecture that the neuronal architecture was adjusted to make more effective use of this attribute. Keywords: Neuronal computation, Entropy, Mutual Information, Estimators, Neuron, Quantal failure, Activation threshold, Neural network Affiliations:
Paprocki B. | - | Kazimierz Wielki University (PL) | Szczepański J. | - | IPPT PAN |
| |
18. |
Szczepański J., Arnold M.♦, Wajnryb E., Amigó J.M.♦, Sanchez-Vives M.V.♦, Mutual information and redundancy in spontaneous communication between cortical neurons,
BIOLOGICAL CYBERNETICS, ISSN: 0340-1200, DOI: 10.1007/s00422-011-0425-y, Vol.104, pp.161-174, 2011. Abstract: An important question in neural information processing is how neurons cooperate to transmit information. To study this question, we resort to the concept of redundancy in the information transmitted by a group of neurons and, at the same time, we introduce a novel concept for measuring cooperation between pairs of neurons called relative mutual information (RMI). Specifically, we studied these two parameters for spike trains generated by neighboring neurons from the primary visual cortex in the awake, freely moving rat. The spike trains studied here were spontaneously generated in the cortical network, in the absence of visual stimulation. Under these conditions, our analysis revealed that while the value of RMI oscillated slightly around an average value, the redundancy exhibited a behavior characterized by a higher variability. We conjecture that this combination of approximately constant RMI and more variable redundancy makes information transmission more resistant to noise disturbances. Furthermore, the redundancy values suggest that neurons can cooperate in a flexible way during information transmission. This mostly occurs via a leading neuron with higher transmission rate or, less frequently, through the information rate of the whole group being higher than the sum of the individual information rates—in other words in a synergetic manner. The proposed method applies not only to stationary, but also to locally stationary neural signals. Keywords: Neurons, Shannon information, Entropy, Mutual information, Redundancy, Visual cortex, Spike trains, Spontaneous activity Affiliations:
Szczepański J. | - | IPPT PAN | Arnold M. | - | Universidad Miguel Hernández-CSIC (ES) | Wajnryb E. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
19. |
Szczepański J., On the distribution function of the complexity of finite sequences,
INFORMATION SCIENCES, ISSN: 0020-0255, DOI: 10.1016/j.ins.2008.12.019, Vol.179, pp.1217-1220, 2009. Abstract: Investigations of complexity of sequences lead to important applications such as effective data compression, testing of randomness, discriminating between information sources and many others. In this paper we establish formulae describing the distribution functions of random variables representing the complexity of finite sequences introduced by Lempel and Ziv in 1976. It is known that this quantity can be used as an estimator of entropy. We show that the distribution functions depend affinely on the probabilities of the so-called “exact” sequences. Keywords: Lempel–Ziv complexity, Distribution function, Randomness Affiliations:
Szczepański J. | - | IPPT PAN |
| |
20. |
Amigó J.M.♦, Kocarev L.♦, Szczepański J., On some properties of the discrete Lyapunov exponent,
PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2008.07.076, Vol.372, pp.6265-6268, 2008. Abstract: One of the possible by-products of discrete chaos is the application of its tools, in particular of the discrete Lyapunov exponent, to cryptography. In this Letter we explore this question in a very general setting. Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN |
| |
21. |
Nagarajan R.♦, Szczepański J., Wajnryb E., Interpreting non-random signatures in biomedical signals with Lempel-Ziv complexity,
PHYSICA D-NONLINEAR PHENOMENA, ISSN: 0167-2789, DOI: 10.1016/j.physd.2007.09.007, Vol.237, pp.359-364, 2008. Abstract: Lempel–Ziv complexity (LZ) [J. Ziv, A. Lempel, On the complexity of finite sequences, IEEE Trans. Inform. Theory 22 (1976) 75–81] and its variants have been used widely to identify non-random patterns in biomedical signals obtained across distinct physiological states. Non-random signatures of the complexity measure can occur under nonlinear deterministic as well as non-deterministic settings. Surrogate data testing has also been encouraged in the past in conjunction with complexity estimates to make a finer distinction between various classes of processes. In this brief letter, we make two important observations: (1) Non-Gaussian noise at the dynamical level can elude existing surrogate algorithms, namely phase-randomized surrogates (FT), amplitude-adjusted Fourier transform (AAFT) and iterated amplitude-adjusted Fourier transform (IAAFT); thus any inference of nonlinear determinism as an explanation for the non-randomness is incomplete. (2) A decrease in complexity can be observed even across two linear processes with identical auto-correlation functions. The results are illustrated with a second-order auto-regressive process with Gaussian and non-Gaussian innovations. AR(2) processes have been used widely to model several physiological phenomena, hence their choice. The results presented encourage cautious interpretation of non-random signatures in experimental signals using complexity measures. Keywords: Lempel–Ziv complexity, Surrogate testing, Auto-regressive process Affiliations:
Nagarajan R. | - | other affiliation | Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN |
| |
22. |
Amigó J.M.♦, Kocarev L.♦, Szczepański J., Theory and practice of chaotic cryptography,
PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2007.02.021, Vol.366, pp.211-216, 2007. Abstract: In this Letter we address some basic questions about chaotic cryptography, not least the very definition of chaos in discrete systems. We propose a conceptual framework and illustrate it with different examples from private and public key cryptography. We elaborate also on possible limits of chaotic cryptography. Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN |
| |
23. |
Amigó J.M.♦, Kocarev L.♦, Szczepański J., Discrete Lyapunov exponent and resistance to differential cryptanalysis,
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS II-EXPRESS BRIEFS, ISSN: 1549-7747, DOI: 10.1109/TCSII.2007.901576, Vol.54, No.10, pp.882-886, 2007. Abstract: In a recent paper, Jakimoski and Subbalakshmi provided a nice connection between the so-called discrete Lyapunov exponent of a permutation F defined on a finite lattice and its maximal differential probability, a parameter that measures the complexity of a differential cryptanalysis attack on the substitution defined by F. In this brief, we take a second look at their result to find some practical shortcomings. We also discuss more general aspects. Keywords: Differential cryptanalysis, discrete Lyapunov exponent (DLE), maximum differential probability (DP) Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN |
| |
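The maximal differential probability discussed in entry 23 (and minimized by the constructions of entries 26-27) can be computed by brute force for small substitutions. The sketch below implements the textbook definition; the 4-bit S-box is an arbitrary illustrative permutation, not one taken from the papers.

```python
def max_differential_probability(sbox):
    """DP(S) = max over nonzero input difference dx and any output difference dy of
    #{x : S(x ^ dx) ^ S(x) = dy} / 2^n  (lower is better against differential attacks)."""
    n = len(sbox)
    best = 0
    for dx in range(1, n):
        counts = {}
        for x in range(n):
            dy = sbox[x ^ dx] ^ sbox[x]
            counts[dy] = counts.get(dy, 0) + 1
        best = max(best, max(counts.values()))
    return best / n

# A 4-bit example permutation (illustrative only).
sbox = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]
print("max differential probability:", max_differential_probability(sbox))
```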
24. |
Amigó J.M.♦, Kocarev L.♦, Szczepański J., Order patterns and chaos,
PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2006.01.093, Vol.355, pp.27-31, 2006. Abstract: Chaotic maps can mimic random behavior in a quite impressive way. In particular, those possessing a generating partition can produce any symbolic sequence by properly choosing the initial state. We study in this Letter the ability of chaotic maps to generate order patterns and come to the conclusion that their performance in this respect falls short of expectations. This result reveals a basic limitation of deterministic dynamics as compared to random ones. This being the case, we propose a non-statistical test based on ‘forbidden’ order patterns to discriminate chaotic from truly random time series with, in principle, arbitrarily high probability. Some relations with discrete chaos and chaotic cryptography are also discussed. Keywords: Chaotic maps, Order patterns, Permutation entropy, Discrete Lyapunov exponent, Chaotic cryptography Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN |
| |
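A small experiment in the spirit of entry 24: count which ordinal (order) patterns of a fixed length never occur in a time series. The logistic map, series length and pattern length are illustrative choices; the paper's formal test and its probability guarantees are not reproduced here.

```python
from itertools import permutations
import numpy as np

def ordinal_patterns(x, L):
    """Set of length-L order patterns (argsort tuples) occurring in the series x."""
    return {tuple(np.argsort(x[i:i + L])) for i in range(len(x) - L + 1)}

def forbidden_patterns(x, L):
    """Order patterns of length L that never appear in x."""
    return set(permutations(range(L))) - ordinal_patterns(x, L)

rng = np.random.default_rng(2)
N, L = 5000, 4

# Logistic map x_{n+1} = 4 x_n (1 - x_n): fully chaotic, yet order-pattern-deficient.
x = np.empty(N); x[0] = 0.3
for i in range(N - 1):
    x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

print("forbidden patterns, logistic map :", len(forbidden_patterns(x, L)), "of 24")
print("forbidden patterns, i.i.d. noise :", len(forbidden_patterns(rng.random(N), L)), "of 24")
```

For a series of this length the i.i.d. sample typically realizes all 24 patterns, whereas the deterministic orbit leaves several of them empty, which is the discriminating signature the entry exploits.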
25. |
Kocarev L.♦, Szczepański J., Amigó J.M.♦, Tomovski I.♦, Discrete chaos - I: theory,
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2006.874181, Vol.53, No.6, pp.1300-1309, 2006. Abstract: We propose a definition of the discrete Lyapunov exponent for an arbitrary permutation of a finite lattice. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by proving that, for large classes of chaotic maps, the corresponding discrete Lyapunov exponent approaches the largest Lyapunov exponent of a chaotic map when M→∞, where M is the cardinality of the discrete phase space. In analogy with continuous systems, we say the system has discrete chaos if its discrete Lyapunov exponent tends to a positive number, when M→∞. We present several examples to illustrate the concepts being introduced. Keywords: Chaos, discrete chaos, Lyapunov exponents Affiliations:
Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Tomovski I. | - | other affiliation |
| |
26. |
Amigó J.M.♦, Szczepański J., Kocarev L.♦, A chaos-based approach to the design of cryptographically secure substitutions,
PHYSICS LETTERS A, ISSN: 0375-9601, DOI: 10.1016/j.physleta.2005.05.057, Vol.343, pp.55-60, 2005. Abstract: We show that chaotic maps may be used for designing so-called substitution boxes for ciphers resistant to linear and differential cryptanalysis, providing an alternative to the algebraic methods. Our approach is based on the approximation of mixing maps by periodic transformations. The expectation behind this is, of course, that the nice chaotic properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. We show that this is indeed the case and that, in principle, substitutions with close-to-optimal immunity to linear and differential cryptanalysis can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy. Keywords: Chaotic maps, Periodic approximations, Bit permutations, Cryptanalysis Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Szczepański J. | - | IPPT PAN | Kocarev L. | - | University “Kiril i Metodij” (MK) |
| |
27. |
Szczepański J., Amigó J.M.♦, Michałek T., Kocarev L.♦, Cryptographically secure substitutions based on the approximation of mixing maps,
IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS I-REGULAR PAPERS, ISSN: 1549-8328, DOI: 10.1109/TCSI.2004.841602, Vol.52, No.2, pp.443-453, 2005. Abstract: In this paper, we explore, following Shannon's suggestion that diffusion should be one of the ingredients of resistant block ciphers, the feasibility of designing cryptographically secure substitutions (think of S-boxes, say) via approximation of mixing maps by periodic transformations. The expectation behind this approach is, of course, that the nice diffusion properties of such maps will be inherited by their approximations, at least if the convergence rate is appropriate and the associated partitions are sufficiently fine. Our results show that this is indeed the case and that, in principle, block ciphers with close-to-optimal immunity to linear and differential cryptanalysis (as measured by the linear and differential approximation probabilities) can be designed along these guidelines. We provide also practical examples and numerical evidence for this approximation philosophy. Keywords: Block cipher, differential cryptanalysis, linear cryptanalysis, mixing dynamical system, periodic approximation, S-box Affiliations:
Szczepański J. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Michałek T. | - | IPPT PAN | Kocarev L. | - | University “Kiril i Metodij” (MK) |
| |
28. |
Kocarev L.♦, Szczepański J., Finite-space Lyapunov exponents and pseudochaos,
PHYSICAL REVIEW LETTERS, ISSN: 0031-9007, DOI: 10.1103/PhysRevLett.93.234101, Vol.93, pp.234101-1-4, 2004. Abstract: We propose a definition of finite-space Lyapunov exponent. For discrete-time dynamical systems, it measures the local (between neighboring points) average spreading of the system. We justify our definition by showing that, for large classes of chaotic maps, the corresponding finite-space Lyapunov exponent approaches the Lyapunov exponent of a chaotic map when M→∞, where M is the cardinality of the discrete phase space. In analogy with continuous systems, we say the system has pseudochaos if its finite-space Lyapunov exponent tends to a positive number (or to +∞), when M→∞. Affiliations:
Kocarev L. | - | University “Kiril i Metodij” (MK) | Szczepański J. | - | IPPT PAN |
| |
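Entries 20, 25 and 28 revolve around a Lyapunov exponent defined on a finite phase space. The sketch below only illustrates the general idea stated in the abstract — averaging the logarithmic spreading of neighboring lattice points under a map of {0, ..., M−1} — with a distance convention chosen by me; the precise definition used in the papers may differ.

```python
import numpy as np

def finite_space_lyapunov(F, M):
    """Average logarithmic spreading of neighboring lattice points under a map F of
    {0, ..., M-1} (illustrative version; shortest distance on the cyclic lattice)."""
    x = np.arange(M - 1)
    d = np.abs(F[x + 1] - F[x])
    d = np.minimum(d, M - d)            # circular distance between the two images
    d = np.maximum(d, 1)                # guard against log(0) if images coincide
    return float(np.mean(np.log(d)))

M = 2 ** 16 + 1                          # odd, so x -> 2x (mod M) is a permutation
F_doubling = (2 * np.arange(M)) % M      # discretization of the doubling map 2x mod 1
F_identity = np.arange(M)

print("discretized doubling map:", round(finite_space_lyapunov(F_doubling, M), 4),
      "(Lyapunov exponent of 2x mod 1 is ln 2 =", round(np.log(2), 4), ")")
print("identity permutation    :", round(finite_space_lyapunov(F_identity, M), 4))
```

With this convention the discretized doubling map already reproduces its continuous Lyapunov exponent ln 2 exactly, while the identity stays at 0, mirroring the convergence statement in the abstract.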
29. |
Szczepański J., Wajnryb E., Amigó J.M.♦, Sanchez-Vives M.V.♦, Slater M.♦, Biometric random number generators,
COMPUTERS AND SECURITY, ISSN: 0167-4048, DOI: 10.1016/S0167-4048(04)00064-1, Vol.23, No.1, pp.77-84, 2004. Abstract: Up to now biometric methods have been used in cryptography for authentication purposes. In this paper we propose to use biological data for generating sequences of random bits. We point out that this new approach could be particularly useful to generate seeds for pseudo-random number generators and so-called “key sessions”. Our method is very simple and is based on the observation that, for typical biometric readings, the last binary digits fluctuate “randomly”. We apply our method to two data sets, the first based on animal neurophysiological brain responses and the second on human galvanic skin response. For comparison we also test our approach on numerical samplings of the Ornstein–Uhlenbeck stochastic process. To verify the randomness of the sequences generated, we apply the standard suite of statistical tests (FIPS 140-2) recommended by the National Institute of Standards and Technology for studying the quality of physical random number generators, especially those implemented in cryptographic modules. Additionally, to confirm the high cryptographic quality of the biometric generators, we also use the often recommended Maurer's universal test and the Lempel–Ziv complexity test, which estimate the entropy of the source. The results of all these verifications show that, after an appropriate choice of encoding and experimental parameters, the sequences obtained exhibit excellent statistical properties, which opens the possibility of a new design technology for true random number generators. It remains a challenge to find appropriate biological phenomena characterized by easy accessibility, fast sampling rate, high accuracy of measurement and variability of sampling rate. Affiliations:
Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) | Slater M. | - | other affiliation |
| |
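A prototype of the extraction step described in entry 29 — keep only the last binary digit of each digitized reading — using an Ornstein-Uhlenbeck surrogate in place of the biometric data, with just the FIPS 140-2 monobit check implemented (the full test suite, Maurer's universal test and the LZ test are not reproduced). All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def ornstein_uhlenbeck(n, theta=1.0, sigma=0.5, dt=1e-3, x0=0.0):
    """Euler-Maruyama sample path of an OU process (surrogate for biometric readings)."""
    x = np.empty(n); x[0] = x0
    for i in range(n - 1):
        x[i + 1] = x[i] - theta * x[i] * dt + sigma * np.sqrt(dt) * rng.normal()
    return x

def last_bits(samples, quantum=1e-4, keep=1):
    """Digitize readings with the given quantum and keep the 'keep' least significant bits."""
    ints = np.round(samples / quantum).astype(np.int64)
    return ((ints[:, None] >> np.arange(keep)) & 1).ravel()

def fips_monobit(bits):
    """FIPS 140-2 monobit test: pass if the ones count in 20,000 bits is in (9725, 10275)."""
    ones = int(bits[:20_000].sum())
    return 9_725 < ones < 10_275, ones

bits = last_bits(ornstein_uhlenbeck(25_000))
ok, ones = fips_monobit(bits)
print(f"monobit test: ones = {ones}, pass = {ok}")
```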
30. |
Szczepański J., Amigó J.M.♦, Wajnryb E., Sanchez-Vives M.V.♦, Characterizing spike trains with Lempel-Ziv complexity,
NEUROCOMPUTING, ISSN: 0925-2312, DOI: 10.1016/j.neucom.2004.01.026, Vol.58-60, pp.79-84, 2004. Abstract: We review several applications of Lempel–Ziv complexity to the characterization of neural responses. In particular, Lempel–Ziv complexity allows one to estimate the entropy of binned spike trains in an alternative way to the usual method based on the relative frequencies of words, with the definite advantage of not requiring very long registers. We also use complexity to discriminate neural responses to different kinds of stimuli and to evaluate the number of states of neuronal sources. Keywords: Lempel–Ziv complexity, Entropy, Spike trains, Neuronal sources Affiliations:
Szczepański J. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Wajnryb E. | - | IPPT PAN | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
31. |
Amigó J.M.♦, Szczepański J., Wajnryb E., Sanchez-Vives M.V.♦, Estimating the Entropy Rate of Spike Trains via Lempel-Ziv Complexity,
Neural Computation, ISSN: 0899-7667, DOI: 10.1162/089976604322860677, Vol.16, No.4, pp.717-736, 2004. Abstract: Normalized Lempel-Ziv complexity, which measures the generation rate of new patterns along a digital sequence, is closely related to such important source properties as entropy and compression ratio, but, in contrast to these, it is a property of individual sequences. In this article, we propose to exploit this concept to estimate (or, at least, to bound from below) the entropy of neural discharges (spike trains). The main advantages of this method include fast convergence of the estimator (as supported by numerical simulation) and the fact that there is no need to know the probability law of the process generating the signal. Furthermore, we present numerical and experimental comparisons of the new method against the standard method based on word frequencies, providing evidence that this new approach is an alternative entropy estimator for binned spike trains. Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
32. |
Amigó J.M.♦, Szczepański J., Wajnryb E., Sanchez-Vives M.V.♦, On the number of states of the neuronal sources,
BIOSYSTEMS, ISSN: 0303-2647, DOI: 10.1016/S0303-2647(02)00156-9, Vol.68, No.1, pp.57-66, 2003. Abstract: In a previous paper (Proceedings of the World Congress on Neuroinformatics (2001)) the authors applied the so-called Lempel–Ziv complexity to study neural discharges (spike trains) from an information-theoretical point of view. Along with other results, it is shown there that this concept of complexity allows one to characterize the responses of primary visual cortical neurons to both random and periodic stimuli. To this aim we modeled the neurons as information sources and the spike trains as messages generated by them. In this paper, we study further consequences of this mathematical approach, this time concerning the number of states of such neuronal information sources. In this context, the state of an information source means an internal degree of freedom (or parameter) which allows outputs with more general stochastic properties, since symbol generation probabilities at every time step may additionally depend on the value of the current state of the neuron. Furthermore, if the source is ergodic and Markovian, the number of states is directly related to the stochastic dependence lag of the source and provides a measure of the autocorrelation of its messages. Here, we find that the number of states of the neurons depends on the kind of stimulus and the type of preparation (in vivo versus in vitro recordings), thus providing another way of differentiating neuronal responses. In particular, we observed that (for the encoding methods considered) in vitro sources have a higher lag than in vivo sources for periodic stimuli. This supports the conclusion put forward in the paper mentioned above that, for the same kind of stimulus, in vivo responses are more random (hence, more difficult to compress) than in vitro responses and, consequently, the former transmit more information than the latter. Keywords: Spike trains, Encoding, Lempel–Ziv complexity, Entropy, Internal states, Numerical invariants for neuronal responses Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Szczepański J. | - | IPPT PAN | Wajnryb E. | - | IPPT PAN | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
33. |
Amigó J.M.♦, Szczepański J., Approximations of Dynamical Systems and Their Applications to Cryptography,
International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, ISSN: 0218-1274, DOI: 10.1142/S0218127403007771, Vol.13, No.7, pp.1937-1948, 2003. Abstract: In recent years a new approach to constructing safe block and stream ciphers has been developed using the theory of dynamical systems. Since a block cryptosystem is generally, from the mathematical point of view, a family (parametrized by the keys) of permutations of n-bit numbers, one of the main problems of this approach is to adapt the dynamics defined by a map f to the block structure of the cryptosystem. In this paper we propose a method based on the approximation of f by periodic maps Tn (e.g. some interval exchange transformations). The approximation of automorphisms of measure spaces by periodic automorphisms was introduced by Halmos and Rohlin. One important aspect studied in our paper is the relation between the dynamical properties of the map f (say, ergodicity or mixing) and the immunity of the resulting cipher to cryptolinear attacks, which is currently one of the standard benchmarks for cryptosystems to be considered secure. Linear cryptanalysis, first proposed by M. Matsui, exploits some statistical inhomogeneities of expressions called linear approximations for a given cipher. Our paper quantifies immunity to cryptolinear attacks in terms of the approximation speed of the map f by the periodic Tn. We show that the most resistant block ciphers are expected when the approximated dynamical system is mixing. Affiliations:
Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Szczepański J. | - | IPPT PAN |
| |
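The periodic approximating maps mentioned in the abstract above include interval exchange transformations. The following sketch (Python) shows what such a transformation does on [0, 1); the lengths, permutation and the helper name make_iet are generic illustrative choices, not the specific approximating maps constructed in the paper.

    # Generic interval exchange transformation (IET): cut [0, 1) into
    # subintervals of given lengths and reassemble them in a permuted order.
    from bisect import bisect_right
    from itertools import accumulate

    def make_iet(lengths, permutation):
        """Return T: [0,1) -> [0,1) exchanging subintervals of the given lengths."""
        assert abs(sum(lengths) - 1.0) < 1e-12
        # Left endpoints of the original subintervals.
        left = [0.0] + list(accumulate(lengths))[:-1]
        # Left endpoints after the subintervals are laid out in permuted order.
        new_left = {}
        offset = 0.0
        for idx in permutation:
            new_left[idx] = offset
            offset += lengths[idx]

        def T(x):
            i = bisect_right(left, x) - 1          # which subinterval x lies in
            return new_left[i] + (x - left[i])     # translate it to its new place
        return T

    if __name__ == "__main__":
        T = make_iet([0.3, 0.5, 0.2], permutation=[2, 0, 1])
        for x in (0.1, 0.4, 0.95):
            print(x, "->", round(T(x), 4))

Because an IET is a piecewise translation, it is invertible and measure preserving, which is what makes this family a natural pool of periodic approximants for the map f.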
34. |
Szczepański J., Amigó J.M.♦, Wajnryb E., Sanchez-Vives M.V.♦, Application of Lempel–Ziv complexity to the analysis of neural discharges,
Network: Computation in Neural Systems, ISSN: 0954-898X, DOI: 10.1088/0954-898X_14_2_309, Vol.14, No.2, pp.335-350, 2003Abstract: Pattern matching is a simple method for studying the properties of information sources based on individual sequences (Wyner et al 1998 IEEE Trans. Inf. Theory 44 2045–56). In particular, the normalized Lempel–Ziv complexity (Lempel and Ziv 1976 IEEE Trans. Inf. Theory 22 75–88), which measures the rate of generation of new patterns along a sequence, is closely related to such important source properties as entropy and information compression ratio. We make use of this concept to characterize the responses of neurons of the primary visual cortex to different kinds of stimulus, including visual stimulation (sinusoidal drifting gratings) and intracellular current injections (sinusoidal and random currents), under two conditions (in vivo and in vitro preparations). Specifically, we digitize the neuronal discharges with several encoding techniques and employ the complexity curves of the resulting discrete signals as fingerprints of the stimulus ensembles. Our results show, for example, that if the neural discharges are encoded with a particular one-parameter method (‘interspike time coding’), the normalized complexity remains constant within some classes of stimuli for a wide range of the parameter. Such constant values of the normalized complexity then allow the differentiation of the stimulus classes. With other encodings (e.g. ‘bin coding’), the whole complexity curve is needed to achieve this goal. In any case, it turns out that the normalized complexity of the neural discharges in vivo is higher (and hence carries more information in the sense of Shannon) than in vitro for the same kind of stimulus. Affiliations:
Szczepański J. | - | IPPT PAN | Amigó J.M. | - | Universidad Miguel Hernández-CSIC (ES) | Wajnryb E. | - | IPPT PAN | Sanchez-Vives M.V. | - | ICREA-IDIBAPS (ES) |
| |
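A minimal sketch (Python) of the quantities discussed above: a Kaspar–Schuster-style count of Lempel–Ziv (1976) phrases, its normalized version, and a simple bin coding that turns spike times into a binary word. The bin width, the normalization by n/log2(n) and the function names are illustrative assumptions, not the exact encodings and estimators used in the paper.

    from math import log2

    def bin_code(spike_times, t_max, bin_width):
        """Binary word: 1 if at least one spike falls in the bin, else 0."""
        n_bins = int(t_max / bin_width)
        word = [0] * n_bins
        for t in spike_times:
            b = int(t / bin_width)
            if b < n_bins:
                word[b] = 1
        return ''.join(map(str, word))

    def lz76_phrase_count(s):
        """Number of phrases in the Lempel-Ziv (1976) exhaustive-history parsing."""
        n = len(s)
        i, k, l, k_max, c = 0, 1, 1, 1, 1
        while True:
            if s[i + k - 1] == s[l + k - 1]:
                k += 1
                if l + k > n:
                    return c + 1
            else:
                k_max = max(k_max, k)
                i += 1
                if i == l:
                    c += 1
                    l += k_max
                    if l + 1 > n:
                        return c
                    i, k, k_max = 0, 1, 1
                else:
                    k = 1

    def normalized_lz(s):
        """Phrase count divided by the ~ n / log2(n) rate of a random binary word."""
        n = len(s)
        return lz76_phrase_count(s) * log2(n) / n

    if __name__ == "__main__":
        import random
        random.seed(1)
        spikes = sorted(random.uniform(0, 10.0) for _ in range(200))
        word = bin_code(spikes, t_max=10.0, bin_width=0.01)
        print(len(word), "bins, normalized LZ complexity:", round(normalized_lz(word), 3))

Sweeping the bin width (or the parameter of an interspike-time encoding) and plotting the normalized complexity against it yields the kind of complexity curve the abstract uses as a fingerprint of a stimulus ensemble.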
35. |
Szczepański J., Michałek T., Random Fields Approach to the Study of DNA Chains,
Journal of Biological Physics, ISSN: 0092-0606, DOI: 10.1023/A:1022508206826, Vol.29, pp.39-54, 2003Abstract: We apply random field theory to the study of DNA chains, which we assume to be trajectories of a stochastic process. We construct a statistical potential between nucleotides corresponding to the probabilities of those trajectories that can be obtained from the DNA database containing millions of sequences. It turns out that this potential has an interpretation in terms of quantities naturally arrived at during the study of the evolution of species, i.e. probabilities of mutations of codons. Making use of recently performed statistical investigations of DNA, we show that this potential has different qualitative properties in the coding and noncoding parts of genes. We apply our model to data for various organisms and obtain good agreement with the results presented in the literature. We also argue that the coding/noncoding boundaries can correspond to jumps of the potential. Keywords: codons, DNA chain, entropy, exons, introns, mutation, random field, stochastic process Affiliations:
Szczepański J. | - | IPPT PAN | Michałek T. | - | IPPT PAN |
| |
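A toy sketch (Python) of the flavour of such a statistical potential: empirical codon-pair transition frequencies of a sequence converted into a pseudo-potential via negative log probabilities. The toy sequence, the fixed reading frame and the names codons/codon_pair_potential are illustrative assumptions; the paper's potential is built from a random-field model over a large DNA database, not from a single sequence.

    from collections import Counter, defaultdict
    from math import log

    def codons(seq, frame=0):
        """Split a nucleotide string into consecutive codons in a fixed frame."""
        seq = seq[frame:]
        return [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]

    def codon_pair_potential(seq):
        """-log P(next codon | current codon) from empirical pair counts."""
        cs = codons(seq)
        pair_counts = Counter(zip(cs, cs[1:]))
        first_counts = Counter(c for c, _ in pair_counts.elements())
        potential = defaultdict(dict)
        for (a, b), n_ab in pair_counts.items():
            potential[a][b] = -log(n_ab / first_counts[a])
        return potential

    if __name__ == "__main__":
        toy = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"
        pot = codon_pair_potential(toy)
        for a, row in list(pot.items())[:3]:
            for b, v in row.items():
                print(f"V({a} -> {b}) = {v:.3f}")

In this toy, low potential values mark frequent codon transitions and high values rare ones, which is the sense in which a jump in the potential along the chain could flag a change of statistical regime such as a coding/noncoding boundary.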
36. |
Szczepański J., A new result on the Nirenberg problem for expanding maps,
Nonlinear Analysis: Theory, Methods & Applications, ISSN: 0362-546X, DOI: 10.1016/S0362-546X(99)00180-7, Vol.43, No.1, pp.91-99, 2001Keywords: Nirenberg problem, Expanding map Affiliations:
Szczepański J. | - | IPPT PAN |
| |
37. |
Szczepański J., Kotulski Z., Pseudorandom Number Generators Based on Chaotic Dynamical Systems,
Open Systems & Information Dynamics, ISSN: 1230-1612, DOI: 10.1023/A:1011950531970, Vol.8, No.2, pp.137-146, 2001Abstract: Pseudorandom number generators are used in many areas of contemporary technology such as modern communication systems and engineering applications. In recent years a new approach to secure transmission of information based on the application of the theory of chaotic dynamical systems has been developed. In this paper we present a method of generating pseudorandom numbers by applying discrete chaotic dynamical systems. The idea of construction of chaotic pseudorandom number generators (CPRNG) intrinsically exploits the property of extreme sensitivity of trajectories to small changes of initial conditions, since the generated bits are associated with trajectories in an appropriate way. To ensure good statistical properties of the CPRNG (which determine its quality) we assume that the dynamical systems used are also ergodic or, preferably, mixing. Finally, since chaotic systems often appear in realistic physical situations, we suggest a physical model of CPRNG. Affiliations:
Szczepański J. | - | IPPT PAN | Kotulski Z. | - | IPPT PAN |
| |
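A minimal sketch (Python) of the general idea of a chaotic pseudorandom bit generator: iterate a mixing map on [0, 1) and emit one bit per iterate by thresholding the trajectory. The tent map, the threshold at 0.5 and the burn-in length are generic textbook choices used only for illustration; they are not the specific construction of the paper, and such a toy generator is not cryptographically secure.

    # Illustration only: bits from the trajectory of a chaotic (mixing) map.
    def tent(x, mu=1.99999):
        """Tent map on [0, 1); mu close to 2 keeps the dynamics chaotic."""
        return mu * x if x < 0.5 else mu * (1.0 - x)

    def chaotic_bits(seed, n_bits, burn_in=100):
        """Emit one bit per iterate: 1 if the state is above 0.5, else 0."""
        x = seed
        for _ in range(burn_in):          # discard the transient near the seed
            x = tent(x)
        bits = []
        for _ in range(n_bits):
            x = tent(x)
            bits.append(1 if x >= 0.5 else 0)
        return bits

    if __name__ == "__main__":
        print(''.join(map(str, chaotic_bits(seed=0.123456789, n_bits=64))))
        # Sensitivity to initial conditions: a tiny change in the seed
        # quickly produces a completely different bit stream.
        print(''.join(map(str, chaotic_bits(seed=0.123456789 + 1e-12, n_bits=64))))

The two printed streams diverge after a few bits, which is the "extreme sensitivity to initial conditions" the abstract points to; ergodicity or mixing of the underlying map is what keeps the long-run bit statistics well behaved.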
38. |
Amigó J.M.♦, Szczepański J., A Conceptual Guide to Chaos Theory,
Prace IPPT - IFTR Reports, ISSN: 2299-3657, No.9, pp.1-43, 1999 | |
39. |
Kotulski Z., Szczepański J., Górski K.♦, Paszkiewicz A.♦, Zugaj A.♦, Application of discrete chaotic dynamical systems in cryptography - DCC method,
International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, ISSN: 0218-1274, DOI: 10.1142/S0218127499000778, Vol.9, No.6, pp.1121-1135, 1999Abstract: In the paper we propose a method of constructing cryptosystems utilizing the non-predictability property of discrete chaotic systems. We point out the requirements for such systems to ensure their security. The presented algorithms of encryption and decryption are based on multiple iteration of a certain dynamical chaotic system coming from gas dynamics models. A plaintext message specifies a part of the initial condition of the system (a particle's initial position). A secret key specifies the remaining part of the initial condition (the particle's initial angle) as well as a sequence of discrete choices of the pre-images in the encryption procedure. We also discuss problems connected with the practical realization of such chaotic cryptosystems. Finally, we present numerical experiments illustrating the basic properties of the proposed cryptosystem. Affiliations:
Kotulski Z. | - | IPPT PAN | Szczepański J. | - | IPPT PAN | Górski K. | - | Warsaw University of Technology (PL) | Paszkiewicz A. | - | Warsaw University of Technology (PL) | Zugaj A. | - | Warsaw University of Technology (PL) |
| |
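The following toy (Python) only illustrates the general principle described above: plaintext and key together form the initial condition of an invertible discrete system, encryption iterates the map forward, and decryption iterates its inverse. It uses Arnold's cat map on a finite torus instead of the gas-dynamics system and secret pre-image choices of the actual DCC method, the names and parameters are made up for the example, and it has no cryptographic value.

    # Toy analogy of the "initial condition" idea only -- NOT the DCC method of
    # the paper and NOT a secure cipher. The plaintext gives one coordinate of
    # the initial condition; the key gives the other coordinate and the number
    # of iterations.
    M = 2 ** 16  # state space: points of the M x M discrete torus

    def cat_forward(x, y):
        """One step of Arnold's cat map on the discrete torus Z_M x Z_M."""
        return (x + y) % M, (x + 2 * y) % M

    def cat_backward(x, y):
        """Exact inverse of cat_forward."""
        return (2 * x - y) % M, (y - x) % M

    def encrypt(plain_block, key_coord, key_rounds):
        x, y = plain_block % M, key_coord % M
        for _ in range(key_rounds):
            x, y = cat_forward(x, y)
        return x, y                       # the final point is the ciphertext

    def decrypt(cipher_point, key_rounds):
        # The whole point is transmitted, so only the round count is needed to
        # invert -- one more reason this is merely an illustration.
        x, y = cipher_point
        for _ in range(key_rounds):
            x, y = cat_backward(x, y)
        return x                          # recover the plaintext coordinate

    if __name__ == "__main__":
        key_coord, key_rounds = 40321, 25
        c = encrypt(plain_block=12345, key_coord=key_coord, key_rounds=key_rounds)
        print("ciphertext point:", c)
        print("decrypted block :", decrypt(c, key_rounds))  # -> 12345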
40. |
Szczepański J., On a problem of Nirenberg concerning expanding maps in Hilbert space,
Proceedings of the American Mathematical Society, ISSN: 1088-6826, DOI: 10.2307/2159486, Vol.116, No.4, pp.1041-1044, 1992Abstract: Let H be a Hilbert space and f: H → H a continuous map which is expanding (i.e., ||f(x) − f(y)|| ≥ ||x − y|| for all x, y ∈ H) and such that f(H) has nonempty interior. Are these conditions sufficient to ensure that f is onto? This question was stated by Nirenberg in 1974. In this paper we give a partial negative answer to this problem; namely, we present an example of a map F: H → H which is continuous, not onto, F(H) has nonempty interior, and for every x, y ∈ H there is n_0 ∈ N (depending on x and y) such that for every n > n_0, ||F^n(x) − F^n(y)|| ≥ c^(n−m) ||x − y||, where F^n is the nth iterate of the map F, c is a constant greater than 2, and m is an integer depending on x and y. Our example satisfies ||F(x)|| = c ||x|| for all x ∈ H. We show that no map with the above properties exists in the finite-dimensional case. Affiliations:
Szczepański J. | - | IPPT PAN |
| |
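For readability, the conditions stated in the abstract above, transcribed into standard notation (LaTeX; a restatement only, adding nothing beyond the abstract):

    % Nirenberg (1974): if f : H \to H is continuous, expanding, and f(H) has
    % nonempty interior, must f be onto?
    \|f(x) - f(y)\| \ge \|x - y\| \qquad \text{for all } x, y \in H.
    % The counterexample F of the paper is continuous, not onto, F(H) has
    % nonempty interior, \|F(x)\| = c\|x\| for all x \in H, and
    \|F^{n}(x) - F^{n}(y)\| \ge c^{\,n-m}\,\|x - y\|
    \qquad \text{for all } n > n_0(x,y), \quad c > 2, \ m = m(x,y) \in \mathbb{Z}.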