Married functions

In Gödel, Escher, Bach, on page 137, two “married” functions are defined:

for n >= 1:
F(n) = n – M(F(n – 1))
M(n) = n – F(M(n – 1))

for n = 0:
F(0) = 1, M(0) = 0
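
To see how the pair behaves, here is a small Python sketch of these two definitions (the memoisation via lru_cache is my own addition, to keep the recursion cheap):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def F(n):
        return 1 if n == 0 else n - M(F(n - 1))

    @lru_cache(maxsize=None)
    def M(n):
        return 0 if n == 0 else n - F(M(n - 1))

    print([F(n) for n in range(10)])  # [1, 1, 2, 2, 3, 3, 4, 5, 5, 6]
    print([M(n) for n in range(10)])  # [0, 0, 1, 2, 2, 3, 4, 4, 5, 6]

Each of the two sequences increases by one roughly every other step.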

I wondered whether it is easy to define three functions whose values increase every time n is increased by three, using only addition, subtraction and recursive calls to previous values of the functions. It turns out it is not too hard to find some.

for n >= 1:
F(n) = n – M(n – 1) – N(F(n – 1) + M(n – 1) + N(n – 1) – 2)
M(n) = F(n – 1)
N(n) = M(n – 1)

for n = 0:
F(0) = 1, M(0) = 0, N(0) = 0
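
Here is a Python sketch of the three functions. One caveat: at n = 1 the argument of N becomes -1, which the base cases above do not cover; the sketch assumes that N of a negative argument is 0 (this assumption is only needed at n = 1):

    from functools import lru_cache

    @lru_cache(maxsize=None)
    def F(n):
        if n == 0:
            return 1
        return n - M(n - 1) - N(F(n - 1) + M(n - 1) + N(n - 1) - 2)

    @lru_cache(maxsize=None)
    def M(n):
        return 0 if n == 0 else F(n - 1)

    @lru_cache(maxsize=None)
    def N(n):
        if n <= 0:
            return 0  # N(0) = 0 by definition; negative arguments (hit only at n = 1) are treated as 0
        return M(n - 1)

    print([F(n) for n in range(10)])  # [1, 1, 1, 2, 2, 2, 3, 3, 3, 4]

Each of F, M and N then increases by one every third step, as intended.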

Reverse a singly linked list recursively

There are two main ways to reverse a linked list: iteratively and recursively. I find the recursive solution cleaner and easier to understand. Unfortunately, I didn’t find any examples using tail recursion among the top results I looked at, so I list my approach.
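
Below is a sketch of one tail-recursive reversal in Python; the ListNode class follows the definition used in the LeetCode problem, while the inner helper and its names are only illustrative:

    class ListNode:
        # Singly linked list node, matching the LeetCode definition.
        def __init__(self, val=0, next=None):
            self.val = val
            self.next = next

    def reverse_list(head):
        # Tail-recursive reversal: `acc` accumulates the already reversed
        # prefix, and the recursive call is the last operation in each frame.
        def go(node, acc):
            if node is None:
                return acc
            rest = node.next
            node.next = acc
            return go(rest, node)
        return go(head, None)

Note that CPython does not eliminate tail calls, so the recursion depth still grows linearly with the list length; the iterative version avoids that, but the recursive one reads more directly as “move one node at a time onto the already reversed part”.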

If someone wants to try their own approach, they can try on:
https://leetcode.com/problems/reverse-linked-list/

On marginalization

“I can’t bear to hear a human being spoken of with contempt just because of his group identification-even by other human beings. It’s these respectable people here who create those hooligans out there.”

The quote is spoken by Dors in “Prelude to Foundation” by Isaac Asimov.

“These respectable people here” refers to the snobbish class of people making the contemptuous remarks. The “hooligans out there” refers to the marginalized part of the community that has developed violent behaviour.

Russell’s paradox

Let A be any set. We define B such that:
B = \{x \in A : x \notin x\}

Can it be that B \in A? No.

Proof by contradiction.

Assume B \in A.

  • If B \in B then by the definition of B:
    B \notin B. Contradiction.
  • If B \notin B then by the definition of B:
    B \in A \text{ and } B \notin B \implies B \in B. Contradiction.

Therefore B \notin A.

Credits for the succinct proof.
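
Since A was arbitrary, the same argument shows that every set fails to contain at least one set:

\forall A \; \exists B : B \notin A

In particular, there is no set of all sets; this is one way to state Russell’s paradox as a theorem rather than a contradiction.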

Some notes on Decision Theory

Once upon a time, I took a course in Decision Theory at ETH. Here are some notes:

normative theory vs descriptive theory (maths vs psychology)

prospect theory

options vs choices

money pump

bonds vs shares

Main notions:

  1. Slide
    1. Normative: Perfect rationality, optimality, payoff maximality
    2. Descriptive: How real people think and make decisions.
      1. bias: persistent and systematic deviation from rationality
    3. Prescriptive: How can we help people to make better decisions. How can we structure the choice environment to minimize decision errors and biases.
    4. People use limited heuristics in making decisions
    5. Decision maker
      1. individual, couple, family, firm, country
      2. any monolithic cognitive agent
    6. Goals, preferences, wants, desires
    7. options, alternatives, courses of action
    8. beliefs
    9. Decision under
      1. certainty
        1. all options and outcomes are completely known to the DM (military diet, UPS example)
      2. risk
        1. When all the choices and options are known but there are well defined stochastic nodes (gambling, insurance, investing)
      3. uncertainty
        1. options are not well defined, or probabilities are not defined (or both) (examples most of life)
    10. Preferences
      1. Revealed, Complete, Transitive, Independent, Stable
    11. People often satisfice instead of optimize
    12. People guess, estimate and try to do “well enough” rather than optimally.
    13. Rationality is bounded in two ways.
      1. Information from the environment
      2. cognitive capacity
    14. Sometimes close enough is fine, sometimes not
    15. Cognitive illusions: Monty Hall problem
    16. Limitations in attention and memory. There are tools which compensate for that.
    17. Multi-Attribute Utility Theory (MAUT)
      1. identify options
      2. define goals
      3. quantify subjective evaluations
      4. weight and scale different evaluations so they can be aggregated meaningfully.
      5. make reasonable and justifiable decisions
    18. Problem, Objectives, Alternatives, Consequences, Tradeoffs (PrOACT)
    19. Comparative, multiple stakeholders (probably not equally important), multiple goals (maybe not equally important), multiple choices with multiple attributes, judgements are necessary, quantify!
      1. Identify the decision maker
      2. Identify the alternatives
      3. the relevant attributes
      4. assign values to the attributes of each alternative
      5. determine a weight for each attribute
      6. for each alternative, compute a weighted mean
      7. make a provisional decision
      8. get feedback and run a sensitivity analysis
    20. compensatory vs noncompensatory
      1. sometimes some alternatives are not acceptable (minimum requirements)
    21. minor changes may cause a change in the resulting decision
    22. “fair” method and easy to develop political support
    23. analytic hierarchy process: binary comparison of pairs of attributes in terms of their importance and then assign a weight (how much better one alternative is compared to the other)
    24. built in check for transitivity and consistency
    25. a well defined process to turn subjective judgements into more objective ratings
  2. Secretary problem
    1. What makes a good decision (good outcome or good reasons?)
    2. expected value
    3. probability (classical, frequency, subjective)
    4. it can be hard for people to reason about probabilities correctly (Monty Hall, other pitfalls)
    5. birthday problem
    6. bayes rule
    7. st petersburg gamble
    8. Moral worth: Bernoulli: the worth of something is not the same as its value → Definition of utility (utility = value^a)
    9. Expected utility does not work always
    10. Allais gambles. The results contradict EUT.
    11. Feelings about risk: People will accept 1000 times greater risks if taken of their own volition rather than imposed involuntarily
    12. O.J. Simpson example
    13. People are generally risk averse, but to varying degrees
    14. EVT defines rationality if the goal of the DM is to maximize value in the long run
    15. EUT defines rationality if the goal of the DM is to maximize utility (moral worth) in the long run
    16. Risk seeking in the domain of losses, risk averse in the domain of gains
    17. Decision framing
    18. Money pump
  3. Sequential investment problem
    1. Persistent cognitive noise
    2. systematic departures from EUT – biases
    3. Friedman-Savage utility function
    4. Prospect theory
      1. descriptive account of decision making under risk
      2. aims to account for systematic deviations from EUT and EVT
      3. Coding — outcomes relative to the current state in terms of gains and losses
      4. combination — joining similar prospects
      5. segregation — isolation of sure things
      6. cancellation — discarding shared components
      7. simplification — rounding
      8. elimination of dominated options
    5. probability weighting function — the distortion people apply to probabilities (small probabilities are magnified at the expense of larger ones)
    6. 4-fold pattern of risk preferences
      1. risk seeking over low p gains — lotteries
      2. risk averse over low p losses — insurance
      3. risk averse over high p gains — certainty effect
      4. risk seeking over high p losses — end of the day effect
    7. certainty effect
      1. insurance which does not pay with a very small probability
    8. end of the day effect
      1. people make large bets on the last day to “break even”
    9. Shortcomings
      1. describes average choices on average
      2. not for everyone
      3. not even for the same person over time
      4. no insight — just better predictions
      5. does not scale (portfolio example, multistage risky decisions)
    10. Dynamic decision contexts are defined by:
      1. sequential choices are made and they are interspersed with updated information about the environment
      2. the decision environment changes over time (depending on the choices of DM)
    11. normative approaches
      1. maximize expected payoff
      2. dynamic programming
    12. descriptive approaches
      1. case studies
      2. experimental methods (behavioral laboratories)
      3. human behaviour is compared to the normative approach
      4. identification of systematic departures from optimality — existent cognitive mechanisms can be accounted for
      5. ⇒ people are predictably irrational (underlying assumption)
    13. optimal stopping
      1. example: secretary problem
      2. target: select a maximum value
      3. full information games
        1. perfectly known distribution
        2. draw a number from [0, 20] — you are given the relative order of the numbers encountered so far but not their values
      4. partial information
        1. known distribution but not its parameters
        2. german tank problem
      5. no information
        1. iid random variables with unknown distribution
        2. secretary problem
        3. Sometimes called the Sultan’s Dowry Problem, the Marriage Problem, or the Fussy Suitor Problem
        4. optimal decision policy — threshold (e^{-1}); see the simulation sketch after these notes
        5. generalized secretary problem
        6. persistent bias for stopping too soon
        7. probability distortion and not risk aversion is the (assumed) answer
      6. Other dynamic decision problem
        1. sell a good over a time interval where offers in a specific range arrive with a probability
    14. Kelly criterion
      1. always bet the same proportion of wealth (2p – 1); see the sketch after these notes
      2. maximizes the expected geometric mean
      3. a poor criterion for predicting decision makers’ behavior
      4. people have strong urge to adjust their investments
      5. evidence of learning to adjust less over rounds
      6. if we charge people when they adjust, we help them
    15. sequential investment problem
      1. optimal policy: bet max or not according to whether we are “above or below the expectation”
    1. Heuristics as rules of thumb.
    2. gambler’s fallacy (law of small numbers)
    3. hot hand fallacy (law of small numbers)
    4. stars
    5. availability heuristic (how easily a piece of information can be recalled)
    6. anchoring and adjustment
      1. factorial example where the larger numbers are presented first or last
      2. use of early values to predict
    7. framing effect (the influence of questions)
    8. confirmation bias
    9. are people really irrational — heuristics evolved for some reason. Carefully designed corner cases are examined.
    1. public goods game
      1. selfish players take no interest
      2. social optimal is far off
      3. many people are pro-social
      4. design vector according to the answers
    2. overconfidence quiz
      1. attentional
        1. selective memory/information search: confirmation bias
        2. selective encoding: excuses/rationalizations
        3. selective encoding: rewarded events are remembered better
      2. motivational
        1. need to appear competent and confident to others and to oneself. Confidence and optimism help to get things done.
    1. Bordeaux and regression
    2. wisdom of crowds
    3. risky decision theory and gambling
    4. rush of winning
    5. highly reinforcing
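
Two of the notes above are easy to make concrete. First, the 1/e threshold rule for the secretary problem: a small Monte Carlo sketch (the function name and parameters are mine, chosen for illustration only):

    import math
    import random

    def simulate_secretary(n, trials=100_000):
        # Classic stopping rule: look at the first round(n / e) candidates without
        # committing, then take the first candidate better than everything seen so
        # far (or the last candidate if none is). Returns how often the single
        # best candidate is chosen; the theoretical optimum is about 1/e = 0.368.
        threshold = round(n / math.e)
        successes = 0
        for _ in range(trials):
            ranks = list(range(n))  # 0 is the best candidate
            random.shuffle(ranks)
            best_seen = min(ranks[:threshold], default=n)
            chosen = next((r for r in ranks[threshold:] if r < best_seen), ranks[-1])
            successes += chosen == 0
        return successes / trials

    print(simulate_secretary(100))  # typically prints something close to 0.37

Lowering the threshold below n/e models the “stopping too soon” bias mentioned in the notes and visibly reduces the success rate.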

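Second, the Kelly criterion: a sketch of repeated even-money bets in which a fixed fraction 2p - 1 of the current wealth is staked each round (the default parameter values are only illustrative):

    import random

    def kelly_wealth(p=0.6, rounds=1000, start=1.0, fraction=None, seed=0):
        # Repeated even-money bets won with probability p. The Kelly rule stakes
        # the fixed fraction f = 2p - 1 of current wealth, which maximizes the
        # expected growth rate (equivalently, the expected geometric mean).
        rng = random.Random(seed)
        f = (2 * p - 1) if fraction is None else fraction
        wealth = start
        for _ in range(rounds):
            stake = f * wealth
            wealth += stake if rng.random() < p else -stake
        return wealth

    print(kelly_wealth())              # Kelly fraction f = 0.2
    print(kelly_wealth(fraction=0.5))  # over-betting typically ends up far lower

The point in the notes is that, even though this policy is optimal for long-run growth, it is a poor description of what people actually do: real decision makers feel a strong urge to keep adjusting their stakes.
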
Hofstadter sequence

In the book Gödel, Escher, Bach: An Eternal Golden Braid, in Chapter III (page 73), there is a puzzle about a sequence of numbers:

1  3  7  12  18  26  35  45  56  69  …

The puzzle is about characterising the sequence, that is, coming up with a way to find the next number in the sequence. You may want to try it yourself before reading the rest of the post.

The section of the chapter where this puzzle appears discusses Recursively Enumerable Sets vs Recursive Sets. The main point of the chapter is that:

There exist formal systems whose set of non-theorems is not the set of theorems of any formal system.

When the author presents the puzzle, he calls for some reflection on the chapter in order to find a formula that describes the sequence.

Trying to solve it, I started by observing the differences between subsequent numbers, which in this case are:

2  4  5  6  8  9  10  11  13

Then I looked at the differences between those numbers:

2  1  1  2  1  1  1  2

Now, this sequence rings a bell. The notes of the musical scale are separated by tones and semitones:

si (1/2) do (1/1) re (1/1) mi (1/2) fa (1/1) sol (1/1) la (1/1) si

where 1/2 represents a semitone and 1/1 represents a tone.

If we look carefully, we notice that this last sequence of differences is the same as the tone and semitone pattern with the fractions inverted.

I thought this could be a solution, since one of the main topics of the book is the interplay of music and maths. On the other hand, it makes no use of the discussion in the chapter. By looking up the sequence on the web, I found the Hofstadter sequence. Looking at its definition, I had doubts that what I had found was equivalent.

Finally, I wrote a simple Python script to check whether the two sequences are the same.
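
A sketch of such a check, written with generators; it assumes the comparison is against Hofstadter’s Figure-Figure sequence R from GEB, which is the sequence matching these initial terms (R(1) = 1 and R(n+1) = R(n) + S(n), where S lists, in increasing order, the positive integers that do not appear in R):

    from itertools import islice

    def music_sequence():
        # Conjectured rule from above: the second differences cycle through the
        # inverted tone/semitone pattern of the scale, 2 1 1 2 1 1 1.
        pattern = [2, 1, 1, 2, 1, 1, 1]
        value, diff, i = 1, 2, 0
        while True:
            yield value
            value += diff
            diff += pattern[i % len(pattern)]
            i += 1

    def figure_figure():
        # Hofstadter's R sequence: R(1) = 1 and R(n+1) = R(n) + S(n), where S
        # lists, in increasing order, the positive integers not appearing in R.
        r, seen, s = 1, {1}, 2
        while True:
            yield r
            r += s
            seen.add(r)
            s += 1
            while s in seen:
                s += 1

    print(list(islice(music_sequence(), 10)))  # [1, 3, 7, 12, 18, 26, 35, 45, 56, 69]
    print(list(islice(figure_figure(), 10)))   # [1, 3, 7, 12, 18, 26, 35, 45, 56, 69]

    for i, (a, b) in enumerate(zip(music_sequence(), figure_figure())):
        if a != b:
            print(f"first mismatch at index {i}: {a} != {b}")  # index 12: 115 != 114
            break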

Apparently, they are not equivalent. Moreover, for some Hofstadter sequences, such as the Q-sequence, it is not even known whether they are defined for all n.

Nevertheless, the beginning of the sequence presents a funny coincidence and writing the Python code to check the non-equivalence of the two sequences made me appreciate even more the power of generators in Python.

On diversity

“The people of the Settlements select themselves. They select people like themselves. There’s a shared culture, even, to some extent, a shared physical appearance on each Settlement. Earth, on the other hand, is, and through all of history has been, a wild mixture of cultures, all enriching each other, competing with each other, suspicious of each other. Tanayama and many other Earthmen– myself, for instance–consider such a mixture to be a source of strength, and feel that cultural homogeneity on the Settlements weakens them and, in the long run, shortens their potential life span.”

From the book Nemesis by Asimov, page 123.