Tuesday, 3 February 2026

Kernel of inner products (2): the full solution

This is a sequel to my previous post on kernels of inner products.

Recall the original problem:

Q1. Consider two inner products $\langle \cdot , \cdot \rangle _1$ and $\langle \cdot , \cdot \rangle _2$ on the same vector space. If $\langle x, y\rangle _1 = 0$ iff $\langle x, y\rangle _2 = 0$ for all $x, y$, show that $\langle \cdot , \cdot \rangle _1 = c \langle \cdot , \cdot \rangle _2$ for some $c>0$.

Two extended problems were posted but not fully solved at the time, so I would like to address them here.

I have to admit I was a bit silly to look for help in the realm of algebraic geometry when the big brothers, matrix algebra and functional analysis, were right there. The tools involved are actually familiar, and it turns out the question is not completely new but rather a generalization of an existing result.

Q2. Consider the matrix form of Q1: suppose $A, B$ are real, symmetric and positive-definite matrices such that $x^tAy = 0$ iff $x^tBy = 0$ for all real vectors $x, y$. Prove that $A = cB$ for some $c > 0$.

Last time we mentioned an argument by contraposition that tracks the matrix entries. This is painful but probably workable; there is, however, a more elegant solution using matrix algebra, given below.

First note that a symmetric positive-definite matrix is invertible and has a symmetric square root, which we denote $B^{1/2}$, together with its (also symmetric) inverse $B^{-1/2}$. We then define $C = B^{-1/2}AB^{-1/2}$. It looks very similar to $B^{-1}A$, and it behaves similarly in the sense that $A = cB$ iff $C = cI$. Splitting $B^{-1}$ this way is not a rare trick at all; it even made an appearance in my thesis (in the form of constructed operators). The point is that it lets us move factors across the inner product.

Given $x,y$, define $u=B^{1/2}x$ and $v=B^{1/2}y$. This is just an invertible change of variables, since $B$ is invertible. Now suppose $\langle u,Cv \rangle =0$. Expanding gives $\langle u,Cv \rangle =\langle B^{1/2}x, B^{-1/2}Ay \rangle = 0$.

Since the square root is real symmetric, we can move it to the other side and retrieve $\langle x, Ay \rangle =0=\langle x,By \rangle$, which is the original hypothesis. Furthermore, putting the square root back, we recover $0 = \langle B^{1/2}x, B^{-1/2}By  \rangle = \langle u,v \rangle$. That is, $\langle u,v \rangle =0$ iff $\langle u,Cv \rangle =0$. This is a very strong characterization!
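For the numerically inclined, here is a minimal numpy sketch of the square-root swap; the helper functions and the random matrices are my own illustration and not part of the original argument.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def random_spd(n):
    # A random symmetric positive-definite matrix.
    M = rng.standard_normal((n, n))
    return M @ M.T + n * np.eye(n)

def sym_sqrt(M):
    # Symmetric square root via the eigendecomposition M = Q diag(w) Q^T.
    w, Q = np.linalg.eigh(M)
    return Q @ np.diag(np.sqrt(w)) @ Q.T

A, B = random_spd(n), random_spd(n)
B_half = sym_sqrt(B)
B_half_inv = np.linalg.inv(B_half)
C = B_half_inv @ A @ B_half_inv

x, y = rng.standard_normal(n), rng.standard_normal(n)
u, v = B_half @ x, B_half @ y

# The swap: <u, Cv> = <x, Ay> and <u, v> = <x, By>.
print(np.isclose(u @ (C @ v), x @ (A @ y)))  # True
print(np.isclose(u @ v, x @ (B @ y)))        # True
```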

Since $C$ is symmetric positive-definite, it has real positive eigenvalues and an orthonormal basis of eigenvectors. It suffices to prove that it has a single eigenvalue of full multiplicity.

Suppose $C$ has eigenpairs $(\lambda_i, e_i)$ where the $(e_i)$ form an orthonormal basis. For $i\neq j$ set $u = e_i+e_j$ and $v = e_i - e_j$; then $\langle u,v \rangle =0$. Therefore $\langle u,Cv \rangle = \langle e_i+e_j, \lambda_i e_i - \lambda_j e_j \rangle = \lambda_i - \lambda_j =0$, which proves the claim.
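To see the eigenvector argument in action (in contrapositive form), here is a sketch in the same spirit as the snippet above: the matrix $A$ is an arbitrary symmetric positive-definite perturbation of $B$ that I made up for illustration, and the pair $u = e_i + e_j$, $v = e_i - e_j$ pulls back to a witness $x, y$ with $x^tBy = 0$ but $x^tAy \neq 0$.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)             # SPD
A = B + np.diag(np.arange(1.0, n + 1))  # SPD, deliberately not a multiple of B

w, Q = np.linalg.eigh(B)
B_half = Q @ np.diag(np.sqrt(w)) @ Q.T
B_half_inv = np.linalg.inv(B_half)
C = B_half_inv @ A @ B_half_inv

lam, E = np.linalg.eigh(C)              # eigenpairs of C; columns of E are orthonormal
e_i, e_j = E[:, 0], E[:, -1]            # eigenvectors with distinct eigenvalues
u, v = e_i + e_j, e_i - e_j             # orthogonal, yet <u, Cv> = lam_min - lam_max != 0

x, y = B_half_inv @ u, B_half_inv @ v   # pull back to the original variables
print(np.isclose(x @ (B @ y), 0.0))     # True:  x^T B y = 0
print(np.isclose(x @ (A @ y), 0.0))     # False: x^T A y != 0, so the Q2 hypothesis fails
```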

This is simply another wonderful example of the power of the matrix square root as an introduction to square roots of operators. When we teach (infinitely differentiable) functions acting on matrices via power series, we usually reach for fancy functions like the exponential or trigonometric functions. Among these the square root is a truly underrated tool, hugely useful in problems like this even before we get into functional analysis.

*

Speaking of functional analysis, do you still remember the next part of the question?

We know that positive-definiteness is strong enough to allow such a property. On the other hand, definiteness does seem too strong after all. For example, the statement certainly holds for low-dimensional symmetric matrices, as we mentioned last time: let $A, B \in \mathbb{R}^{2\times 2}$ be symmetric; then $\langle x, Ay \rangle =0$ iff $\langle x, By \rangle =0$ for all $x,y$ implies $A = cB$ for some constant $c$.

But now what if I say this is also true for all symmetric bilinear forms? Let us recall a very similar result on linear forms:

Lemma 1. If $f,g \in V^*$ are two non-zero linear forms, then $\ker f = \ker g$ iff $f =cg$ for some non-zero constant $c$.

The proof is similar to what we did last time. One direction is obvious, so assume $\ker f = \ker g$. Take two linearly independent elements $x,y \in V$ such that $f(x), f(y) \neq 0$; then $g(x), g(y) \neq 0$ as well. (If no such pair exists, e.g. when $\dim V \le 1$, the statement is trivial.) We can then write $f(x) = c_xg(x)$ and $f(y) = c_yg(y)$.

Just like in our original proof we can find $\alpha$ such that $f(x-\alpha y) = 0 = g(x - \alpha y)$. In fact $\alpha = f(x)/f(y)$ upon substitution. From here we calculate
$0 = g(x-\alpha y) = g(x) - (f(x)/f(y))\,g(y)$
$= g(x) - \big(c_xg(x)/(c_yg(y))\big)g(y) = g(x) - (c_x/c_y)g(x)$.
Since $g(x)$ is non-zero we conclude that $c_x = c_y$.
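As a sanity check, here is a small numpy illustration of Lemma 1 on $\mathbb{R}^5$, identifying a linear form with its coefficient vector; the vectors and the constant $2.5$ are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

g = rng.standard_normal(n)     # the linear form g(x) = g . x
f = 2.5 * g                    # f = c g, hence ker f = ker g
h = rng.standard_normal(n)     # a generic form, not a multiple of g

# An orthonormal basis of ker g: the right singular vectors orthogonal to g.
_, _, Vt = np.linalg.svd(g.reshape(1, -1))
K = Vt[1:].T                   # columns span ker g

print(np.allclose(f @ K, 0))   # True:  f vanishes on ker g
print(np.allclose(h @ K, 0))   # False: a non-multiple of g does not
x = rng.standard_normal(n)     # off the kernel the ratio is the constant c
print(np.isclose((f @ x) / (g @ x), 2.5))  # True
```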

This is just a simplified version of a tiny result in Rudin's book... I should have realized it earlier!

We then notice that our problem is just the bilinear version of the above:

Q3. Consider two symmetric bilinear forms $a, b: V\times V\to \mathbb{R}$. If the kernels of $a$ and $b$ are equal, then $a = cb$ for some non-zero constant $c$.

Traditionally the kernel is only defined for linear maps, but allow me to abuse the term (here it means the zero set $\{(x,y) : a(x,y)=0\}$) and proceed...

WLOG assume that the bilinear forms are non-zero. With the linear version above in mind, all we need is to extend it to both arguments. Fix $y \in V$ and write $f_y = a(\cdot, y)$, $g_y = b(\cdot, y)$; then $f_y, g_y \in V^*$ are functionals with equal kernels. By Lemma 1, for every $y\in V$ where $f_y$ (equivalently $g_y$) is non-trivial there exists a constant $c_y$ such that $f_y = c_yg_y$. Since the bilinear forms are non-zero, $c_y$ is defined for at least some $y\in V$. The aim is to prove that $c_y$ is consistent across $V$.

Case 1: consider the case where the span of $\left\{ g_y : y \in V \right\}$ has dimension 1.

Since the dimension is 1, the span is generated by a single functional, say $g$. Then $f_y = \alpha (y) g$ and $g_y = \beta (y) g$ for some functions $\alpha, \beta: V\to \mathbb{R}$ (note $f_y$ is a multiple of $g_y$, hence also of $g$), and $\alpha, \beta$ turn out to be linear by linearity of the bilinear forms.

Since the kernels of $f_y$ and $g_y$ are equal for each $y$ (we have to be careful about which variable we are talking about), we know that $\alpha (y) = 0$ iff $\beta (y) = 0$; this uses $g \neq 0$, or else the bilinear forms would be trivial. Lemma 1, applied to $\alpha$ and $\beta$, tells us that $\alpha = c\beta$ for some constant $c$. That is, $f_y = cg_y$ for that single constant.

Case 2: consider the case where the span of $\left\{ g_y \right\}$ has dimension greater than 1. Then consider a basis (hello, axiom of choice!) $\left\{ g_{y_i} \right\}$ of this span and show that $c_{y_i} = c_{y_j}$ for any $i\neq j$ by linearity:

$f_{y_i+y_j} = f_{y_i} + f_{y_j} = c_{y_i}g_{y_i} + c_{y_j}g_{y_j}$
$f_{y_i+y_j} = c_{y_i+y_j}g_{y_i+y_j} = c_{y_i+y_j}(g_{y_i}+g_{y_j})$

Taking the difference gives
$(c_{y_i} - c_{y_i+y_j})g_{y_i} + (c_{y_j}-c_{y_i+y_j})g_{y_j} = 0$,
which forces $c_{y_i} = c_{y_i+y_j} = c_{y_j}$ by linear independence.

That is, $f_{y_i} = cg_{y_i}$ for every basis vector $g_{y_i}$, and this extends to every $y$ by linearity: write $g_y = \sum_i \lambda_i g_{y_i} = g_{\sum_i \lambda_i y_i}$, so $z = y - \sum_i \lambda_i y_i$ satisfies $g_z = 0$, hence $f_z = 0$ as well (equal kernels), and therefore $f_y = \sum_i \lambda_i f_{y_i} = c\sum_i \lambda_i g_{y_i} = cg_y$. $\square$
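Here is a matrix-language sanity check of the argument, under the identification $a(x,y) = x^tAy$ for a symmetric matrix $A$ (partly anticipating the PS below); the matrices, the constant $-1.5$ and the little witness hunt are my own illustration, not part of the proof.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4

S = rng.standard_normal((n, n))
B = S + S.T                    # symmetric, not necessarily definite: b(x, y) = x^T B y
A = -1.5 * B                   # a = c b with c = -1.5, so the zero sets coincide

# Mirroring the proof: fix y = e_j, so f_y and g_y are the column forms A[:, j], B[:, j].
# Lemma 1 recovers c_y from any x off the kernel, and it is the same constant for every j.
x = rng.standard_normal(n)
print(np.allclose((x @ A) / (x @ B), -1.5))   # True: one constant across all columns

# A symmetric A2 that is not a multiple of B has a different zero set:
A2 = B + np.diag([1.0, 0.0, 0.0, 0.0])
y = np.eye(n)[0]                              # fix y = e_1
_, _, Vt = np.linalg.svd((B @ y).reshape(1, -1))
for z in Vt[1:]:                              # z runs over a basis of ker g_y, i.e. z^T B y = 0
    if not np.isclose(z @ A2 @ y, 0.0):
        print("witness: z^T B y = 0 but z^T A2 y != 0")
        break
```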

*

Wait...did we even use symmetry?

Theorem 1. Two non-zero bilinear forms share the same kernel iff one is a non-zero constant multiple of the other.

That's it. Perhaps the most general statement we can hope for. It is like showing a function is $C^1$ on a domain with $C^1$ boundary... you can't ask for more.

Had this turned out to be false, there would certainly be a lot more to ask. For example, we could look for the maximal (why would it exist?) subspace of $Bil(V)$ on which the property holds.

Such a subspace would certainly contain more than just the symmetric bilinear forms; one can show, for example, that alternating forms satisfy the property as well. Would the maximal subset be something non-trivial sitting between the subspace of symmetric forms and the whole space of forms? Could we characterize it, and would it carry any special structure?...

But no, we got the perfect answer.

On the other hand, it doesn't seem like too much of a surprise once we quote the lemma: this is just a natural extension of the linear case. There are also deeper reasons why it should hold. For one, we can quote a fundamental result in algebraic geometry, the Nullstellensatz: in a simple application, polynomials with equal zero sets generate the same radical ideal. Linear forms give linear polynomials and bilinear forms give quadratic ones. Of course the Nullstellensatz requires the field to be algebraically closed (which is not the case here), but the spirit is the same.

There are still a lot we can look into from here on, especially the geometry in it:

- The kernel equality is an equivalence relation that partitions the space of forms into subsets whose zero sets coincide. The result says that these classes divide the space of bilinear forms into rank-1 fibers (punctured lines through the origin).
- Elements of the quotient space are not subspaces (since zero is absent), but they can be identified with points of a projective space. In fact, since $\dim Bil(V) = n^2$ when $\dim (V) = n$, the quotient is isomorphic to the projective space $\mathbb{P}^{n^2-1}$.
- We often define metrics via bilinear (or higher-order) forms. The result provides a rigidity statement about the orthogonality structure of bilinear forms, in the sense that there can only be one metric (up to scalar multiples) inducing a given orthogonality structure. And that brings us to conformal... *cough* I had better not talk about that before exposing myself too much (^.^)

Oh well, there goes our fun little problem, and welcome to 2026. I can't promise that I'll try my best to bring up more interesting math discussions from here on!

PS: now with the full proof in mind, can you prove the same statement in matrix-algebra language? Is it possible to write that proof without simply translating the above into matrices (i.e. without null spaces etc.)? If that's too hard, can we do it for some special cases, e.g. when $A$ is invertible or symmetric?

Tuesday, 20 January 2026

20/1/2026: Godhood in a single battle / 觀音樓 / Speedruns

If anything major happened in early January 2026, there is only one answer: under the wise command of the Great Leader, the US military captured Venezuelan president Maduro alive and rescued the country from dictatorship (probably).

The Chinese foreign ministry went with the wording "extremely shocked", but the ones who should be even more shocked are the Russians, whose own speedrun failed and turned into a four-year war. What shocks them is not how a great power can ignore international law and snatch a man across borders like this, but that the one achieving godhood in a single battle was not their own construction-corps defence minister. You might think this is a joke, but open any pro-Russian Telegram military group and it is not hard to see that they really do think this way. One reason they react like this is that if they had actually managed to speedrun Ukraine four years ago and take the win in one hour and twenty-two minutes, they would not have had to suffer these four years.

Setting the law and the international relations aside, the biggest discussion online was probably Grok's automatic image editing? Netizens originally used the feature to generate lewd pictures, but the moment they saw that photo of Maduro flanked by two DEA agents they couldn't resist editing it: Venezuelan soldiers capturing the Great Leader instead, or assorted anime characters being taken alive. Let's just say netizens share some common denominators beyond the lewd stuff.

A side complaint about the BBC. Their live text coverage is always oddly secretive: the pages use hashed URLs, and once the live coverage ends the link to it gets pulled. That is, if you didn't get in while it was live, the only way in afterwards is to have Google dig it up for you. Isn't that a hassle? Is there any need to keep such a low profile? Hiding the political live pages to avoid being proven wrong later, fine, but why are you hiding the sports ones too?

Then the Great Leader announced there is a next one. Greenland, Colombia, Cuba...

Hmm? Cuba?

*

I don't have many impressions of Cuba; I can only list four. The Cuban Missile Crisis is of course one, and the cluster of cigar shops near my home is the second. The other two are online videos: the third is Cubans building their own intranet to play CS, and the fourth is a certain Cuban-Chinese restaurant in New York. I can't really pronounce its Spanish name, but its Chinese name is easy to recognize: 觀音樓.

For someone with little connection to Latin America, understanding the culture of the place is genuinely hard. Fortunately food is a language that knows no borders; one look and one bite is enough. You may not be used to the dishes mixed with piles of beans and plantains, but their chow mein (chop suey, I should say) is something you can find in Chinese fast food all over the world. Not today's mass-produced, ever-expanding Chinese chain junk, but the Chinese food that immigrant families cook abroad for the "gweilo".

Sometimes work takes me to an unfamiliar country town. Alone, with no appetite for a proper Western meal and tired of takeaway sushi, I end up in a place like this and order a chow mein and a lemon chicken. In the back kitchen the father and son do the cooking, the mother runs the register, and kids dash out from the family home attached to the kitchen. You take your order: the chow mein is, unsurprisingly, oily noodles fried like Shanghai-style thick noodles, with a few slices of char siu, carrots and broccoli. The lemon chicken is the least Chinese of Chinese dishes: thickly battered deep-fried chicken thigh with lemon sauce, not the traditional lemon sauce of southwestern China of course, but a cheap syrup mixed from lemon juice; calling it a watered-down sweet and sour pork would already be flattery. As Chinese food the standard is mediocre at best, yet what you care about is a satisfaction beyond the food itself.

Well, given the chance I could write more about my memories of Chinese restaurants abroad. 觀音樓 is actually closer to a restaurant than a takeaway-only Chinese fast food joint, but somehow I have a soft spot for this little eatery I have never set foot in.

Why did I suddenly remember it? It actually folded after thirty or forty years in business, unable to survive the pandemic, and every time I go back to that video the comments underneath are full of reminiscence. But this time is different: 觀音樓 has reopened four blocks from the original site, seemingly with the original crew, and seemingly still able to keep Cuban-Chinese cuisine, that marvellous accident of history, going for a while longer. If any reader is lucky enough to eat there, leave a comment and let me know how it tastes!

*

It doesn't matter that the Russians couldn't speedrun Ukraine; at least we still have game speedruns to watch. The annual AGDQ has come around again, and the first instalment of my random web-wandering journal was written because of AGDQ in the first place. Yet surprisingly few runs interested me this time:

- dlroW oiraM repuS is a mod that clears SMW in reverse order. A normal playthrough shouldn't be too hard, but this is a classic case of a game that is easy to play and brutally hard to speedrun. Worth a watch if you have some SMW experience.
- Bloons Tower Defense 6. An ancestor-tier tower defence series, but I always feel that once too many complicated mechanics are added, someone will find a cookie-cutter routine that clears everything, and that was exactly my impression after watching this run. Simpler is more fun; focusing on the balance between stable damage output and enemy health is quite enough.

How should I put it. It's not that I stopped watching because speedrunning got boring; it just happened that nothing appealed to me. I didn't follow the Switch 2 launch titles; old Nintendo games like the kart racers and Metroid happen to be things I don't play (I used to watch for the "kill the animals" segment, but nobody cares about that meme anymore); regulars like 2D Sonic and Super Monkey Ball didn't show up; and few of the new niche games were attractive. The Emerald finale where the (donating) audience picks a single Pokémon does create a real sense of participation, but the routes are largely the same, and anyone who knows the game well could probably post a similar time. Even more annoying, the run got interrupted halfway by a fire alarm! Closing out an event with so few memorable moments on an anticlimax is, well, rather fitting. Hmm.

summoningsalt, speedrunning historian and holder of the record for the fastest Mike Tyson knockout in Punch-Out!!, recently released a video with his thoughts on the claim that "speedrunning is dead". He first looked at Google search interest in speedrunning, AGDQ and his own videos, which is what people mainly base the claim on; he then proposed other measurements and argued that overall interest is still growing, just without the explosive growth of the pandemic years, and that traffic spreading across newer events, games and creators also makes the overall numbers look stagnant. He believes speedrunning has "matured", with a stable community and stable traffic, and of course a sizeable slice of that cake is his.

Everyone knows the argument; what I want to talk about is this notion of maturity.

The notion of "maturity" has come up repeatedly on this blog. For example, when does a community's understanding of a game count as mature? To me the key is that you no longer grope around blindly, but build a systematic, quantitative deconstruction. Once you can find a theory that handles the game's existing mechanics and even lets you quickly make sense of updates and rebalances, the understanding of the game has matured. The example that left the deepest impression on me is of course the process of exploring my own theory for the 我桐 game.

Speaking of which, browser games seem to have had a bit of a revival lately. Several idle RPGs have popped up in the West one after another, while the Chinese-speaking side has wuxia ones, cultivation ones, and OGame-style ones... at least this time I won't get slaughtered in the middle of the night.

Carrying the same theory over to the understanding of speedrunning, maturity is the ability to quickly grasp the threads of speedrunning a new game. The most superficial layer is of course the skill and hand speed needed to play the game "straight". Any RPG involves stat planning, and wherever there are choices there is route planning; if there is a story, you check whether it can be skipped; wherever there are timers there is RNG manipulation; a memory overflow means you can go for ACE; in action games, how the physics model does its checks and whether the calculations have exploitable holes, and so on. Which tricks are easy and which are inconsistent; which get used early on and which are saved for the very end.

Once the threads are in place, the progress of a speedrun becomes orderly and linear, like mining toward a target. Whether the digging continues, and how deep it goes, depends only on whether anyone shows up to dig and how good they are. Nowadays, after clearing a new game a few more times, how the time will be cut down is already clear in one's mind; what remains is just execution.

For example, the weathertanko trick in Mario Kart 64 generally has a success rate of only 5%. If you want to land it on all three laps, you are gambling on roughly one-in-eight-thousand luck. But wasn't that brute-forced by weathertanko himself over more than ten thousand attempts? Three months ago Bismuth released a new video dissecting the difficulty of a perfect SMB1 speedrun (that is, tying the TAS). He took Niftski, the long-time leaderboard dominator most likely to break the record, worked out his success rate at every hard section and every step, and arrived at odds of roughly one in sixty-five thousand, perhaps improving to about one in twenty-five thousand if his skills keep progressing. That is a notch harder than weathertanko's three-in-a-row, and the game is longer too, but isn't it still just a question of probability? The answer is simply: more attempts, more months, more years.

You might object. Four years ago Bismuth also put out a video on why 4:54 (meaning any time below 4:55) is the perfect speedrun, and seven years ago one saying the TAS framerules of certain levels were "1/1000", "maybe someday" and "not happening" respectively. Compared with that, matching the TAS all the way to 8-4 with a final time of 4:54.4 is far better; isn't that a sign of theoretical progress?

Not really. This is more like a superb miner arriving and pulling the growth curve forward. The humanly viable SMB1 route has been sitting there for many years; it's just that nobody had practised it to the point of executing it. Niftski's own success rate on 8-2 is 36%, hardly a "not happening". You think he is uniquely talented? There are now more people than you can count on one hand who can finish the game in 4:54, and their success rates on 8-2 can't be that low either. Under the stacking of talent and effort, many strats are nowhere near as impossible as outsiders imagine.

So when the trajectory is already laid out, the surprises a speedrun can bring seem to shrink a great deal.

That's just my one-sided take, though.

Even now I occasionally watch Werster stream his Emerald Battle Factory challenge. Compared with other speedrun challenges that demand hand speed and precision, this one is more a combination of theory and luck. Theory matters a great deal, but if the game refuses to give you usable Pokémon, or puts a team that counters yours right in front of you, or lands a critical hit at exactly the critical moment, aren't you starting over anyway? Every battle, every round is like rolling dice. Seen that way, Werster's challenge is just another form of dice rolling. But I suppose what the audience really wants to see is him flying into a rage and imitating a Pokémon's scream when he gets screwed over, or that feral joy when he scrapes through, like the time he cleared all seven golds in one sitting.

Oh right, TGM4's highest achievement, GM King of Rounds, has finally been conquered too. The strategy hasn't really changed since the freeze principle for 1300+ was discovered; what changed is only proficiency and aggressiveness, and the rest is luck. I have to say the clear was still very pretty, especially bringing it home in the last few seconds with shaking hands that nearly threw everything away; it earned every bit of applause... But it also means that squeezing out 1300 lines in 65 seconds is still not the human limit, right? Well, that kind of inhuman question is best left to Arika to think about.

*

From the very start I hesitated over whether this post belongs to the random web-wandering journal or the miscellany. These are three topics with a clear order, each closely tied to the next. On the other hand, as a casual record it is a bit long-winded, three topics are a bit too few, and some of them are not about what is happening right now but about things spanning a stretch of time, which seems to go against the format of the wandering journal.

Looking at its four-thousand-character length, let's file it as a miscellany. Either way, whether it's the random web-wandering journal or the miscellany, the tag is "essays" anyway, isn't it?