zyxzevn | 1 month ago | on: Carl Sagan's Baloney Detection Kit: Tools for Thinking Critically (2025)
zyxzevn's comments
zyxzevn | 2 months ago | on: Trump says Venezuela’s Maduro captured after strikes
zyxzevn | 2 months ago | on: NYC Mayoral Inauguration bans Raspberry Pi and Flipper Zero alongside explosives
zyxzevn | 2 months ago | on: Inca Stone Masonry
The older construction is made of very big stones of hard granite that fit together perfectly. Assuming they had some form of concrete, it is easy to see how they were able to make them fit so well. If you have a source of materials, concrete is not difficult to make. See https://www.geopolymer.org/
People were not stupid, and technologies were invented and forgotten. Just like Roman technologies were lost in the Middle Ages, this building technology was lost to the Incas.
The Incas built their houses and temples on top of the existing ones. They used smaller stones that did not fit together as well. Still a great culture, but with different technologies.
South America has a lot of cultures that disappeared. They had no written history and a lot of stuff was destroyed by later cultures (including the Spanish). So it is impossible for historians to get it right.
For example, there were also people with elongated skulls and red hair in Peru. That could be a result of inbreeding, as they also had some other physiological differences. Maybe they were exterminated by another tribe. https://www.youtube.com/watch?v=5dfpLN3FbQs
History is often full of conflicts, but it is presented as if it is all known. There are often disputes with engineers who point out the different technologies used for buildings and the like. These technologies do not fit in the simplified timeline of mainstream history.
This difference in technology is obvious in the extremely accurate Egyptian granite vases https://www.youtube.com/watch?v=7BlmFKSGBzI and granite boxes.
zyxzevn | 3 months ago | on: Fifty Shades of OOP
The problem is that the components are often connected to different interfaces/graphs. Components can never be fully separated, due to debugging, visualization and storage requirements.
In non-OOP systems these interfaces are closed or absent, so you get huge debug, visualization and storage functions that do everything, in addition to the other functionality. And these functions need to be updated for each new type of data. The complexity just moves to a different part of the system. Most importantly, any new type requires changes to many functions. This affects the whole team and well-tested code. If your product is used by different companies with different requirements (different data types), these functions become overly complex.
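To make the trade-off concrete, here is a minimal Python sketch (all types and functions are made up for illustration): in the non-OOP style, every cross-cutting function enumerates all types, so each new type means editing all of them; in the OOP style that code lives inside the type.

```python
from dataclasses import dataclass

@dataclass
class Circle:
    r: float

@dataclass
class Square:
    s: float

def debug_dump(shape):
    # Non-OOP style: must grow a branch for every new type.
    if isinstance(shape, Circle):
        return f"Circle(r={shape.r})"
    if isinstance(shape, Square):
        return f"Square(s={shape.s})"
    raise TypeError(shape)

def store(shape):
    # ...and so must every other cross-cutting function.
    if isinstance(shape, Circle):
        return {"kind": "circle", "r": shape.r}
    if isinstance(shape, Square):
        return {"kind": "square", "s": shape.s}
    raise TypeError(shape)

class Triangle:
    # OOP alternative: a new type touches only its own class,
    # leaving debug_dump/store of other types untouched.
    def __init__(self, b, h):
        self.b, self.h = b, h
    def debug_dump(self):
        return f"Triangle(b={self.b}, h={self.h})"
    def store(self):
        return {"kind": "triangle", "b": self.b, "h": self.h}
```

Adding a `Triangle` in the first style would require edits in `debug_dump`, `store`, and every similar function; in the second style it is one self-contained class.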
zyxzevn | 3 months ago | on: New magnetic component discovered in the Faraday effect
There is also evidence that "photons" are just thresholds in the material that is used to detect light. The atoms vibrate with the EM wave, and at a certain threshold they switch to a higher vibration state that can release an electron. If the starting state is random, the release of an electron will often coincide with the light that is emitted from just one atom.
This threshold means that one "photon" can cause zero or multiple detections. Eric Reiter tested this in many experiments and saw that this variation indeed happens, especially when the experiment is tuned to reveal it, for example by using high-frequency light. It also happens in experiments done by others, but they disregarded the zero or multiple detections as noise. I think the double-detection effect was discovered when he worked in a laboratory with ultraviolet light.
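Purely as a thought experiment, the threshold idea can be sketched as a toy Monte Carlo (my own illustration of the model described above, not Reiter's code and not standard quantum optics): each detector atom starts at a random sub-threshold level, so the same pulse energy can produce zero, one, or several "clicks".

```python
import random

def detections(n_atoms, pulse_energy, threshold, seed=0):
    """Toy threshold-detector model (illustrative only): each atom
    starts at a random pre-existing excitation below the threshold,
    absorbs an equal share of the wave energy, and 'clicks' when it
    crosses the threshold."""
    rng = random.Random(seed)
    share = pulse_energy / n_atoms
    clicks = 0
    for _ in range(n_atoms):
        start = rng.uniform(0.0, threshold)
        if start + share >= threshold:
            clicks += 1
    return clicks

# With the same "one photon" of pulse energy, different random starting
# states give zero, one, or multiple clicks from run to run.
counts = [detections(n_atoms=10, pulse_energy=1.0, threshold=1.0, seed=s)
          for s in range(1000)]
```

In this toy model the click count varies around one per pulse, which is the kind of zero/multiple-detection variation the comment describes.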
Here is a paper about Eric Reiter's work: https://progress-in-physics.com/2014/PP-37-06.PDF And here is his book. https://drive.google.com/file/d/1BlY5IeTNdu1X6pRA5dnJvRq3ip6...
zyxzevn | 5 months ago | on: Hardware Stockholm Syndrome
Another option is Erlang. At the top level it is organized with micro-services instead of functions.
None of them are systems languages. The old hardware had weird data and memory formats. With C, a lot of assembler could be avoided when programming this hardware. It came by default with Unix and some other operating systems. Fortran and Pascal were somewhat similar.
The default languages on most systems were interpreted, so you got LISP and BASIC. There was no fast hardware for that. To get things fast, one needed to program in assembler, unless a C compiler was available.
zyxzevn | 5 months ago | on: Solar leads EU electricity generation as renewables hit 54%
zyxzevn | 5 months ago | on: New bacteria, and two potential antibiotics, discovered in soil
zyxzevn | 6 months ago | on: The case against social media is stronger than you think
There will be a bias in moderation, but it will have less of an effect when there is no deletion. If possible, the user could choose their preferred style (or bias) of moderation. If you want full freedom, you can let users select "super-users" to moderate/categorize for them.
Emotional responses and troll jokes could be separate categories, as long as they do not call for violence or break other laws.
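A minimal sketch of the categorize-instead-of-delete idea (all post texts and tag names below are hypothetical): posts are tagged rather than removed, and each reader chooses which tags to hide.

```python
# Nothing is ever deleted; moderators (or user-chosen "super-users")
# only attach category tags, and each reader picks a filter.
posts = [
    {"id": 1, "text": "Thoughtful argument", "tags": set()},
    {"id": 2, "text": "Angry rant",          "tags": {"emotional"}},
    {"id": 3, "text": "Troll joke",          "tags": {"troll-joke"}},
]

def visible(posts, hidden_tags):
    """Return the posts a reader sees under their chosen filter."""
    return [p for p in posts if not (p["tags"] & hidden_tags)]

strict_reader = visible(posts, {"emotional", "troll-joke"})  # hides 2 and 3
free_reader = visible(posts, set())                          # sees everything
```

The moderation bias then only affects the tags, and a reader who distrusts it can simply choose an empty filter.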
Consensus is still group-think. I think it is destructive without any clear view of where it stands among other options or ideas. Like: "why exactly is the earth not the center?" A lot of consensus is also artificial, due to biased reporting, biased censorship and biased sponsorship. During discussions, people within a consensus tend to use logical fallacies, like portraying the opposition as idiots, or avoiding any valid points that the opposition brings into the discussion.
I think that people have become less intelligent due to one-sided reporting of information. With more information, people would become smarter and more understanding of how other (smart) people think.
zyxzevn | 6 months ago | on: The case against social media is stronger than you think
Different opinions do matter. But due to the algorithms, the most emotional responses are promoted. There is no way to promote facts, or what people think are facts.
So most discussions will be extremely emotional and not based on facts and their value. This is true even in scientific discussions.
Combined with group-think, these emotions can grow and lead to catastrophic outcomes.
zyxzevn | 6 months ago | on: Weird CPU architectures, the MOV only CPU (2020)
The MOVE architectures may work best in digital signal processors, because the data flow in such processors is almost constant.
I invented my own version of the move-only architecture (around 1992), but focused on speed. Here is the idea:
1. The CPU only moves data within the CPU, from one register to another, so all moves are extremely fast.
2. The CPU is separated into different units that can work independently. Each unit has its own input and output ports. The ports and registers are connected via a bus.
3. The CPU can have more buses and thus do more moves at the same time. If the output data is not ready, the instruction waits.
Example instruction: OUT1 -> IN1, OUT2 -> IN2. With a 32-bit instruction (four 8-bit port addresses, two per move) this would give 8 units with 32 ports each.
Example of a set of units and ports:
Control unit: (JUMP_to_address, CALL_to_address, RETURN_with_value, +conditionals)
Memory unit: (STORE_Address, STORE_Value, READ_Address, READ_Value)
Computation unit: (Start_Value, ADD_Value, SUB_Value, MUL_Value, DIV_Value, Result_Value)
Value unit: (Value_from_next_instruction, ZERO, ONE)
Register unit: (R0 ... R31)
It is extremely flexible. I also came up with a minimalist 8-bit version. One could even "plug in" different units for different systems. Certain problems could be solved by adding special ports, which would work like special instructions.
I did not continue the project because people did not understand the bus architecture (like a PCI bus). If you try to present it as a logic-gate architecture (like in the article), the units make the architecture look more complicated than it actually is.
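For what it's worth, the port-based scheme above can be sketched as a small simulator (my own reconstruction with illustrative unit and port names, not the original 1992 design): an instruction is just a list of source-to-destination moves over the bus, and all unit behaviour hides behind the ports.

```python
class MoveMachine:
    """Toy move-only machine: registers plus a computation unit
    whose operations are triggered by writing to its ports."""

    def __init__(self):
        self.regs = [0] * 32   # register unit: R0..R31
        self.acc = 0           # computation unit's running result

    def read(self, port):
        if port == "OUT":                # computation-unit result port
            return self.acc
        if port.startswith("R"):         # register ports R0..R31
            return self.regs[int(port[1:])]
        raise KeyError(port)

    def write(self, port, value):
        if port == "START":              # load the accumulator
            self.acc = value
        elif port == "ADD":              # writing to ADD performs the add
            self.acc += value
        elif port == "SUB":
            self.acc -= value
        elif port.startswith("R"):
            self.regs[int(port[1:])] = value
        else:
            raise KeyError(port)

    def run(self, program):
        for moves in program:            # one instruction = a set of moves
            for src, dst in moves:
                self.write(dst, self.read(src))

m = MoveMachine()
m.regs[0], m.regs[1] = 5, 7
m.run([
    [("R0", "START")],   # acc = R0
    [("R1", "ADD")],     # acc += R1
    [("OUT", "R2")],     # R2 = result (5 + 7 = 12)
])
```

Note how "add" is not an opcode here: it is a side effect of moving data into the ADD port, which is the essence of a move-only design.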
zyxzevn | 5 years ago | on: How NASA Designed a Helicopter That Could Fly Autonomously on Mars
zyxzevn | 5 years ago | on: Critical Thinking Isn't Just a Process
zyxzevn | 5 years ago | on: Israeli study finds 94% drop in symptomatic Covid-19 cases with Pfizer vaccine
Spring is coming, ending the wave like last year. This can reduce the cases by a similar amount. We will have to wait until next season to see what happens. The sun and the seasons have a dramatic influence on Covid, as we saw last year. It is related to vitamin D and ultraviolet radiation.
Recently some countries changed their PCR tests. Austria had a similar reduction after dropping the PCR tests; they had 95% false positives on PCR compared to other tests. The PCR test is problematic due to the use of far too many cycles. When testing random persons, it gives 95% false positives above 35 cycles, but many countries even use 45 cycles or more. Sadly, using too many cycles is not scientific evidence for anything.
Many countries have also reached natural herd immunity. Sweden is now very close to 60% antibody herd immunity. Besides that, we have T-cell herd immunity, of which we do not know that much. I already had covid, got my antibodies and T-cell immunity, and am now fully immune.
Probably not applicable to Israel: certain countries, like India and Ghana, use cheap old-school medicines that prevent the disease. Changes in medicine use give similar changes in cases. If you don't believe it, just talk to some people who live there; they do not have a covid problem. IMHO this is the best but least profitable way to deal with the disease.
Controversial: the vaccines can cause strong fevers, a well-known side effect. These might have thinned out the weakest people, who then did not catch covid. While I think the effect is small, it can give a temporary reduction in cases. I hope the medicine companies will improve the vaccines so we do not get these side effects.
The differences between these effects can be subtle and depend on the raw data. But statistical analysis can already show some details.
zyxzevn | 5 years ago | on: Pseudophilosophy encourages confused, self-indulgent thinking
It is always funny how many long, boring sentences are found in philosophy. I think the problem is that our language lacks the clarity to describe the subtle nuances of reasoning and logic.
And sadly it is a bit confusing, especially if one (like me) does not know all the people listed in the paragraphs.
Personally, I tend to shorten sentences and focus on presenting the information in logical blocks. While it may not read as gloriously, it can improve the clarity of the information. Maybe this tendency is related to my background in programming.
What problems did Foucault try to solve?
And I think that your statement is related to the difference between the problems Foucault tried to solve and the problems we face now.
In his culture, people believed that personality traits were related to skull shapes. So the idea that such "facts" about skull shapes do not really matter seems very reasonable, especially since we now know that there is no relation at all.
So the "facts" are tied to "beliefs" that give certain values to those "facts". In science we would call those beliefs "models". And while they are well tested, they have limits that are relevant.
Pseudo-philosophy or different belief systems?
Some philosophy wants to avoid "facts"; it is here called pseudo-philosophy. But often it does not deny the "facts"; it has completely different "beliefs" in which the "facts" do not have the same meaning.
The writer of the article wants to dismiss the "false beliefs" as "pseudo-philosophy". But as a solution, I prefer to look at the limits of the beliefs (or models) that are held on to. This goes for both true beliefs and "false beliefs".
Talking about limitations gives an opportunity for dialogue and for understanding different sides. People don't feel attacked, but feel that they can contribute to mutual knowledge.
And regarding scientific beliefs/models, we may find problems with them, which is essential for scientific progress.
zyxzevn | 5 years ago | on: Accused murderer wins right to check source code of DNA testing kit
zyxzevn | 5 years ago | on: Programming for Cats
You get C@
Subreddit for Cat related programmer jokes:
zyxzevn | 5 years ago | on: Magnetic waves explain mystery of Sun's outer layer
If I may explain it a bit further:
We measure magnetic fields in sunspots, and they are pretty stable. They are measured via the Zeeman effect and show complex, very strong magnetic fields (0.1 Tesla and higher). Such fields require gigantic electric currents to sustain them (for a current loop, B = μ0·I/(2R), so I = 2RB/μ0; at these field strengths and radii that is on the order of 10^10 Ampere).
According to mainstream astronomers, the magnetic field should point outward. That means the currents must go around the sunspots, yet we do not see any such surrounding currents, nor do we have any idea where they could be.
Magnetism near moving conductors causes eddy currents, which reduce the magnetic fields. So moving neutral plasma can never be the driver of such magnetic fields.
Yet, there is a clear solution. The sunspots have lines of plasma moving outwards or inwards. And these plasma lines are able to conduct strong currents easily.
So instead of a vertical magnetic field, we have a horizontal magnetic field around the plasma currents. And these plasma currents conduct large amounts of electricity, like lightning.
It is also in line with the behaviour of solar flares: some flares behave like plasma rail guns, which work exactly as I explained.
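For scale, the current implied by a sunspot-strength field can be checked with the textbook formula for the field at the center of a circular current loop, B = μ0·I/(2R). This is my own back-of-the-envelope check; the loop radius below is an assumed round number, not a measured value.

```python
import math

mu0 = 4 * math.pi * 1e-7   # vacuum permeability, T*m/A
B = 0.1                    # sunspot field strength, tesla
R = 1e5                    # assumed loop radius ~100 km (illustrative)

# Field at the center of a circular current loop: B = mu0 * I / (2R)
# Solving for the current:
I = 2 * R * B / mu0
# I comes out around 1.6e10 amperes
```

Even with this modest radius, sustaining a 0.1 T field requires a current on the order of tens of billions of amperes, which is the scale the comment refers to.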
zyxzevn | 5 years ago | on: Timing matters when correcting fake news
I think the list needs another item. For any theory/hypothesis: how well does it stand against the null hypothesis? For example: how much physical evidence is there really for string theory?
And I would upgrade this one: if there's a chain of physical evidence (originally: argument), every link in the chain must work (including the premise), not just most of them.
And breaking these items does not mean that something is false; it means that the arguments and evidence are incomplete. Don't jump to conclusions when you think the arguments or evidence are invalid (that is how some people came to think even the moon landing was a hoax).