You're missing the slash (/) in ΔS=ΔQ/T.
There are other formulations of entropy due to, for example, von Neumann, Gibbs*, Shannon, Bekenstein and Hawking, and, most famously, Boltzmann, whose formula is inscribed on his tombstone:
S = k. log W
where k is Boltzmann's constant and W is the number of microstates consistent with the system's macrostate. In statistical thermodynamics, entropy can be interpreted as a measure of our lack of knowledge about a system's internal configuration. For small enough systems, I believe spontaneous entropy reduction, which looks like time reversal, has been observed, confirming that the second law is probabilistic rather than absolute. However, the probability of Humpty Dumpty spontaneously reassembling and leaping back onto the wall is vanishingly small.
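To make the counting concrete, here is a minimal Python sketch. The toy two-box model and the value N = 100 are my own illustrative choices, not anything from the discussion above:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k log W, where W counts the microstates of a given macrostate
    return k_B * math.log(W)

# Toy model (my assumption): N distinguishable particles, each in the
# left or right half of a box. The macrostate "n particles on the left"
# comprises W = C(N, n) microstates.
N = 100
for n in (0, 25, 50):
    W = math.comb(N, n)
    p = W / 2**N  # probability of the macrostate if every microstate is equally likely
    print(f"n={n:3d}  W={W:.3e}  S={boltzmann_entropy(W):.3e} J/K  p={p:.3e}")
```

Even with only 100 particles, the all-on-one-side macrostate has probability around 8 x 10^-31; for the ~10^25 atoms of a real egg, the odds of spontaneous reassembly are beyond astronomically small.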
Boltzmann took his own life, and some ascribe his preceding depression not only to undiagnosed bipolar disorder but also to his realisation that he could not eliminate circular reasoning when he tried to explain time in terms of entropy.
The arrow of time does not arise from the equations - it's something we deduce - and our brains are driven by increasing entropy, which appears to be correlated with the expansion of the universe. The universe's entropy is potentially bounded by the area of the observable horizon, although some solutions of the Einstein field equations of General Relativity suggest there might be even higher-entropy regions within - J. A. Wheeler's so-called "bags of gold".
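For a sense of scale, the horizon bound follows from the Bekenstein-Hawking formula S = k A / (4 l_P^2), with A the horizon area and l_P the Planck length. A quick sketch, assuming a rough Hubble-horizon radius of ~1.4 x 10^26 m (my own round figure), reproduces the oft-quoted ~10^122 k:

```python
import math

# Physical constants (SI)
k_B  = 1.380649e-23     # Boltzmann's constant, J/K
G    = 6.67430e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.054571817e-34  # reduced Planck constant, J s
c    = 2.99792458e8     # speed of light, m/s

l_P_sq = G * hbar / c**3  # Planck length squared, m^2

def horizon_entropy(area):
    # Bekenstein-Hawking: S = k_B * A / (4 * l_P^2)
    return k_B * area / (4 * l_P_sq)

R = 1.4e26              # rough Hubble-horizon radius, m (my assumption)
A = 4 * math.pi * R**2  # horizon area
print(f"S/k_B ~ {horizon_entropy(A) / k_B:.1e}")  # ~1e122
```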
A few decades ago, I remember reading a suggestion that equations governing entropy, such as its rate of increase over time, remained to be formulated. However, I don't know if anything came of that.
A more recent development is Wolfram's ruliad** approach, which he claims can explain entropy, time and a whole lot besides. I don't understand it well enough to discern whether it, too, falls into a circular logical fallacy.
ETA:
* Gibbs' definition of entropy can be applied to a system far from thermal equilibrium; other equations for entropy assume that the system is in thermal equilibrium.
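For reference, Gibbs' formula is S = -k Σ p_i ln p_i, where p_i is the probability of microstate i. A minimal sketch (the example distributions are my own) showing that it reduces to Boltzmann's S = k log W when all W microstates are equally likely:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    # S = -k * sum(p_i * ln p_i); valid for any probability distribution
    # over microstates, in equilibrium or not
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Sanity check: with W equally likely microstates (p_i = 1/W), Gibbs
# reduces to Boltzmann's S = k log W.
W = 1000
print(gibbs_entropy([1 / W] * W))  # equals k_B * ln(1000)
print(k_B * math.log(W))

# A lopsided (e.g. non-equilibrium) distribution gives lower entropy:
skewed = [0.5] + [0.5 / (W - 1)] * (W - 1)
print(gibbs_entropy(skewed))
```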
** The Concept of the Ruliad