Logical Entropy
4open, Volume 5 (2022), Article 1, 33 pages
Open Access
Section: Physics - Applied Physics
DOI: https://doi.org/10.1051/fopen/2021004
Published online: 13 January 2022
  1. Birkhoff G (1948), Lattice theory, American Mathematical Society, New York.
  2. Grätzer G (2003), General lattice theory, 2nd edn., Birkhäuser Verlag, Boston.
  3. Ellerman D (2010), The logic of partitions: introduction to the dual of the logic of subsets. Rev Symb Log 3, 287–350. https://doi.org/10.1017/S1755020310000018.
  4. Ellerman D (2014), An introduction to partition logic. Log J IGPL 22, 94–125. https://doi.org/10.1093/jigpal/jzt036.
  5. Lawvere FW, Rosebrugh R (2003), Sets for mathematics, Cambridge University Press, Cambridge, UK.
  6. Rota G-C (2001), Twelve problems in probability no one likes to bring up, in: H. Crapo, D. Senato (Eds.), Algebraic combinatorics and computer science: a tribute to Gian-Carlo Rota, Springer, Milano, pp. 57–93.
  7. Rota G-C (1998), Probability, Vols. I & II: the Guidi notes, MIT Copy Services, Cambridge, MA.
  8. Ellerman D (2021), The logical theory of canonical maps: the elements and distinctions analysis of the morphisms, duality, canonicity and universal constructions in Set. arXiv preprint, https://arxiv.org/abs/2104.08583.
  9. Halmos PR (1974), Measure theory, Springer-Verlag, New York.
  10. Rao KPSB, Rao MB (1983), Theory of charges: a study of finitely additive measures, Academic Press, London.
  11. Wilkins J (1707), Mercury, or the secret and swift messenger, London. Originally published in 1641.
  12. Gleick J (2011), The information: a history, a theory, a flood, Pantheon, New York.
  13. Bateson G (1979), Mind and nature: a necessary unity, Dutton, New York.
  14. Gini C (1912), Variabilità e mutabilità, Tipografia di Paolo Cuppini, Bologna.
  15. Friedman WF (1922), The index of coincidence and its applications in cryptography, Riverbank Laboratories, Geneva, IL.
  16. Kullback S (1976), Statistical methods in cryptanalysis, Aegean Park Press, Walnut Creek, CA.
  17. Rejewski M (1981), How Polish mathematicians deciphered the Enigma. IEEE Ann Hist Comput 3, 213–234.
  18. Simpson EH (1949), Measurement of diversity. Nature 163, 688.
  19. Ricotta C, Szeidl L (2006), Towards a unifying approach to diversity measures: bridging the gap between the Shannon entropy and Rao's quadratic index. Theor Popul Biol 70, 237–243. https://doi.org/10.1016/j.tpb.2006.06.003.
  20. Nei M (1973), Analysis of gene diversity in subdivided populations. Proc Natl Acad Sci USA 70, 3321–3323.
  21. Good IJ (1979), A.M. Turing's statistical work in World War II. Biometrika 66, 393–396.
  22. Good IJ (1982), Comment (on Patil and Taillie: diversity as a concept and its measurement). J Am Stat Assoc 77, 561–563.
  23. Stigler SM (1999), Statistics on the table, Harvard University Press, Cambridge, MA.
  24. Hirschman AO (1945), National power and the structure of foreign trade, University of California Press, Berkeley.
  25. Herfindahl OC (1950), Concentration in the US steel industry, unpublished doctoral dissertation, Columbia University.
  26. Rao CR (1982), Diversity and dissimilarity coefficients: a unified approach. Theor Popul Biol 21, 24–43.
  27. Havrda J, Charvát F (1967), Quantification method of classification processes: concept of structural α-entropy. Kybernetika (Prague) 3, 30–35.
  28. Tsallis C (1988), Possible generalization of Boltzmann-Gibbs statistics. J Stat Phys 52, 479–487.
  29. Brukner Č, Zeilinger A (2000), Operationally invariant information in quantum measurements. arXiv preprint, https://arxiv.org/abs/quant-ph/0005084.
  30. Brukner Č, Zeilinger A (2003), Information and fundamental elements of the structure of quantum theory, in: L. Castell, O. Ischebeck (Eds.), Time, quantum and information, Springer-Verlag, Berlin, pp. 323–354.
  31. Shannon CE (1948), A mathematical theory of communication. Bell Syst Tech J 27, 379–423, 623–656.
  32. Shannon CE, Weaver W (1964), The mathematical theory of communication, University of Illinois Press, Urbana.
  33. Shannon CE (1993), The bandwagon, in: N.J.A. Sloane, A.D. Wyner (Eds.), Claude E. Shannon: collected papers, IEEE Press, Piscataway, NJ, p. 462.
  34. Tribus M (1978), Thirty years of information theory, in: R.D. Levine, M. Tribus (Eds.), The maximum entropy formalism, MIT Press, Cambridge, MA, pp. 1–14.
  35. Shannon CE (1993), Some topics in information theory, in: N.J.A. Sloane, A.D. Wyner (Eds.), Claude E. Shannon: collected papers, IEEE Press, Piscataway, NJ, pp. 458–459.
  36. Ramshaw JD (2018), The statistical foundations of entropy, World Scientific Publishing, Singapore.
  37. Lewis GN (1930), The symmetry of time in physics. Science 71, 569–577.
  38. Brillouin L (1962), Science and information theory, Academic Press, New York.
  39. Aczél J, Daróczy Z (1975), On measures of information and their characterization, Academic Press, New York.
  40. Campbell LL (1965), Entropy as a measure. IEEE Trans Inform Theory IT-11, 112–114.
  41. Doob JL (1994), Measure theory, Springer Science+Business Media, New York.
  42. Pólya G, Szegő G (1998), Problems and theorems in analysis, Vol. II, Springer-Verlag, Berlin.
  43. Hu KT (1962), On the amount of information. Theory Probab Appl 7, 439–447. https://doi.org/10.1137/1107041.
  44. Ryser HJ (1963), Combinatorial mathematics, Mathematical Association of America, Washington, DC.
  45. Takács L (1967), On the method of inclusion and exclusion. J Am Stat Assoc 62, 102–113. https://doi.org/10.1080/01621459.1967.10482891.
  46. Cover T, Thomas J (2006), Elements of information theory, 2nd edn., John Wiley and Sons, Hoboken, NJ.
  47. Csiszár I, Körner J (1981), Information theory: coding theorems for discrete memoryless systems, Academic Press, New York.
  48. Wilson RJ (1972), Introduction to graph theory, Longman, London.
  49. Rozeboom WW (1968), The theory of abstract partials: an introduction. Psychometrika 33, 133–167.
  50. McGill WJ (1954), Multivariate information transmission. Trans IRE Prof Group Inform Theory 4, 93–111. https://doi.org/10.1109/TIT.1954.1057469.
  51. Fano RM (1961), Transmission of information, MIT Press, Cambridge, MA.
  52. Yeung RW (1991), A new outlook on Shannon's information measures. IEEE Trans Inform Theory 37, 466–474. https://doi.org/10.1109/18.79902.
  53. MacKay DJC (2003), Information theory, inference, and learning algorithms, Cambridge University Press, Cambridge, UK.
  54. Atkins P, de Paula J, Keeler J (2018), Atkins' physical chemistry, 11th edn., Oxford University Press, Oxford, UK.
  55. Johnson E (2018), Anxiety and the equation: understanding Boltzmann's entropy, MIT Press, Cambridge, MA.
  56. Jaynes ET (2003), Probability theory: the logic of science, Cambridge University Press, Cambridge, UK.
  57. Kaplan W (1999), Maxima and minima with applications: practical optimization and duality, John Wiley & Sons, New York.
  58. Best MJ (2017), Quadratic programming with computer programs, CRC Press, Boca Raton, FL.
  59. Jaynes ET (1978), Where do we stand on maximum entropy?, in: R.D. Levine, M. Tribus (Eds.), The maximum entropy formalism, MIT Press, Cambridge, MA, pp. 15–118.
  60. Papoulis A (1990), Probability and statistics, Prentice-Hall, Englewood Cliffs, NJ.
  61. Dantzig GB (1963), Linear programming and extensions, Princeton University Press, Princeton.
  62. Kullback S, Leibler RA (1951), On information and sufficiency. Ann Math Stat 22, 79–86. https://doi.org/10.1214/aoms/1177729694.
  63. Rao CR (2010), Quadratic entropy and analysis of diversity. Sankhyā Indian J Stat 72-A, 70–80.
  64. Zhang Y, Wu H, Cheng L (2012), Some new deformation formulas about variance and covariance, in: Proceedings of the 2012 International Conference on Modelling, Identification and Control (ICMIC 2012), pp. 987–992.
  65. McEliece RJ (1977), The theory of information and coding: a mathematical framework for communication (Encyclopedia of Mathematics and its Applications, Vol. 3), Addison-Wesley, Reading, MA.
  66. Tamir B, Cohen E (2015), A Holevo-type bound for a Hilbert-Schmidt distance measure. J Quantum Inf Sci 5, 127–133. https://doi.org/10.4236/jqis.2015.54015.
  67. Ellerman D (2018), Logical entropy: introduction to classical and quantum logical information theory. Entropy 20, 679. https://doi.org/10.3390/e20090679.
  68. Auletta G, Fortunato M, Parisi G (2009), Quantum mechanics, Cambridge University Press, Cambridge, UK.
  69. Bennett CH (2003), Quantum information: qubits and quantum error correction. Int J Theor Phys 42, 153–176. https://doi.org/10.1023/A:1024439131297.
  70. Jaeger G (2007), Quantum information: an overview, Springer Science+Business Media, New York.
  71. Manfredi G, Feix MR (2000), Entropy and Wigner functions. Phys Rev E 62, 4665–4674. https://doi.org/10.1103/PhysRevE.62.4665.
  72. Birkhoff G, Von Neumann J (1936), The logic of quantum mechanics. Ann Math 37, 823–843.
  73. Ellerman D (2017), Quantum mechanics over sets: a pedagogical model with non-commutative finite probability theory as its quantum probability calculus. Synthese 194, 4863–4896. https://doi.org/10.1007/s11229-016-1175-0.
  74. Ellerman D (2018), The quantum logic of direct-sum decompositions: the dual to the quantum logic of subspaces. Log J IGPL 26, 1–13. https://doi.org/10.1093/jigpal/jzx026.
  75. Hoffman K, Kunze R (1961), Linear algebra, Prentice-Hall, Englewood Cliffs, NJ.
  76. Kolmogorov AN (1983), Combinatorial foundations of information theory and the calculus of probabilities. Russian Math Surv 38, 29–40.
  77. Zurek WH (2003), Decoherence, einselection, and the quantum origins of the classical. Rev Mod Phys 75, 715–775.
  78. Fano U (1957), Description of states in quantum mechanics by density matrix and operator techniques. Rev Mod Phys 29, 74–93.
  79. Nielsen M, Chuang I (2000), Quantum computation and quantum information, Cambridge University Press, Cambridge, UK.
  80. Tamir B, Cohen E (2014), Logical entropy for quantum states. arXiv preprint, https://arxiv.org/abs/1412.0616v2.
  81. Ellerman D (2009), Counting distinctions: on the conceptual foundations of Shannon's information theory. Synthese 168, 119–149. https://doi.org/10.1007/s11229-008-9333-7.
  82. Ellerman D (2021), New foundations for information theory: logical entropy and Shannon entropy, Springer Nature, Cham, Switzerland.
  83. Tamir B, Paiva IL, Schwartzman-Nowik Z, Cohen E (2021), Quantum logical entropy: fundamentals and general properties. arXiv preprint, https://arxiv.org/abs/2108.02726.
