Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

On the Additive and Subtractive Representation of Even Numbers from Primes

Version 1 : Received: 1 October 2021 / Approved: 5 October 2021 / Online: 5 October 2021 (14:48:18 CEST)
Version 2 : Received: 24 November 2021 / Approved: 25 November 2021 / Online: 25 November 2021 (20:28:09 CET)
Version 3 : Received: 4 December 2021 / Approved: 7 December 2021 / Online: 7 December 2021 (20:52:47 CET)
Version 4 : Received: 12 January 2022 / Approved: 14 January 2022 / Online: 14 January 2022 (14:43:54 CET)
Version 5 : Received: 29 June 2022 / Approved: 30 June 2022 / Online: 30 June 2022 (03:50:17 CEST)
Version 6 : Received: 9 July 2022 / Approved: 11 July 2022 / Online: 11 July 2022 (04:28:25 CEST)
Version 7 : Received: 21 July 2022 / Approved: 22 July 2022 / Online: 22 July 2022 (03:38:39 CEST)
Version 8 : Received: 4 August 2022 / Approved: 5 August 2022 / Online: 5 August 2022 (03:41:27 CEST)
Version 9 : Received: 17 August 2022 / Approved: 18 August 2022 / Online: 18 August 2022 (03:53:01 CEST)
Version 10 : Received: 27 August 2022 / Approved: 30 August 2022 / Online: 30 August 2022 (03:11:25 CEST)
Version 11 : Received: 12 September 2022 / Approved: 13 September 2022 / Online: 13 September 2022 (10:15:57 CEST)
Version 12 : Received: 26 September 2022 / Approved: 27 September 2022 / Online: 27 September 2022 (03:49:33 CEST)
Version 13 : Received: 10 October 2022 / Approved: 11 October 2022 / Online: 11 October 2022 (03:22:25 CEST)
Version 14 : Received: 14 November 2022 / Approved: 15 November 2022 / Online: 15 November 2022 (06:35:57 CET)
Version 15 : Received: 21 November 2022 / Approved: 22 November 2022 / Online: 22 November 2022 (03:01:41 CET)
Version 16 : Received: 12 December 2022 / Approved: 13 December 2022 / Online: 13 December 2022 (02:17:30 CET)
Version 17 : Received: 29 December 2022 / Approved: 30 December 2022 / Online: 30 December 2022 (03:45:58 CET)
Version 18 : Received: 10 July 2023 / Approved: 12 July 2023 / Online: 13 July 2023 (10:01:00 CEST)
Version 19 : Received: 8 August 2023 / Approved: 9 August 2023 / Online: 10 August 2023 (05:23:00 CEST)
Version 20 : Received: 8 February 2024 / Approved: 9 February 2024 / Online: 9 February 2024 (11:19:03 CET)
Version 21 : Received: 10 February 2024 / Approved: 12 February 2024 / Online: 13 February 2024 (07:15:16 CET)

How to cite: Shehu, A.; Uka, J. On the Additive and Subtractive Representation of Even Numbers from Primes. Preprints 2021, 2021100087. https://doi.org/10.20944/preprints202110.0087.v1

Abstract

We demonstrate a new quantitative approach to the sieve of Eratosthenes, offered as an alternative to the sieve of Legendre. In this method, every element of a given set is sifted out exactly once; consequently, the method is free of the Möbius function and of the resulting parity barrier. Using this method, we prove that every sufficiently large even number is the sum of two primes, and that every even number is the difference of two primes in infinitely many ways.
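The abstract's method itself is not reproduced on this page, so as background only: the property it highlights, that each element is sifted out exactly once, is shared by the well-known linear ("Euler") sieve, in which every composite is crossed off by its smallest prime factor alone, so no Möbius-function inclusion-exclusion is needed. The sketch below illustrates that standard construction and is not the authors' sieve.

```python
def linear_sieve(n):
    """Return the primes up to n; each composite is marked exactly once."""
    is_composite = [False] * (n + 1)
    primes = []
    for i in range(2, n + 1):
        if not is_composite[i]:
            primes.append(i)
        for p in primes:
            if i * p > n:
                break
            # i*p is sifted out here and nowhere else:
            # p is the smallest prime factor of i*p.
            is_composite[i * p] = True
            if i % p == 0:
                break
    return primes

print(linear_sieve(30))
```

Because every composite is touched once, the sieve runs in O(n) time, in contrast to the O(n log log n) of the classical sieve of Eratosthenes.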

Keywords

Sieve of Eratosthenes; Goldbach’s conjecture; Polignac’s conjecture; Twin Prime conjecture

Subject

Computer Science and Mathematics, Algebra and Number Theory

Comments (1)

Comment 1
Received: 14 October 2021
The commenter has declared there is no conflict of interests.
Comment: I'm very much looking forward to feedback and suggestions from those who've read the paper.
