
# MathJax

## Delimiters

| Delimiter | Format | Expression | Result | Support |
| --- | --- | --- | --- | --- |
| No delimiters | `str` | `\sqrt{3x-1}+(1+x)^2` | \sqrt{3x-1}+(1+x)^2 | no |
| Bracket without backslash | `[str]` | `[\sqrt{3x-1}+(1+x)^2]` | [\sqrt{3x-1}+(1+x)^2] | no |
| Single backslash with bracket | `\[str\]` | `\[\sqrt{3x-1}+(1+x)^2\]` | \[\sqrt{3x-1}+(1+x)^2\] | yes |
| Double backslash with bracket | `\\[str\\]` | `\\[\sqrt{3x-1}+(1+x)^2\\]` | \\[\sqrt{3x-1}+(1+x)^2\\] | no |
| Parentheses without backslash | `(str)` | `(\sqrt{3x-1}+(1+x)^2)` | (\sqrt{3x-1}+(1+x)^2) | no |
| Single backslash with parentheses | `\(str\)` | `\(\sqrt{3x-1}+(1+x)^2\)` | \(\sqrt{3x-1}+(1+x)^2\) | yes |
| Double backslash with parentheses | `\\(str\\)` | `\\(\sqrt{3x-1}+(1+x)^2\\)` | \\(\sqrt{3x-1}+(1+x)^2\\) | no |
| Single dollar sign | `$str$` | `$\sqrt{3x-1}+(1+x)^2$` | $\sqrt{3x-1}+(1+x)^2$ | yes |
| Double dollar sign | `$$str$$` | `$$\sqrt{3x-1}+(1+x)^2$$` | $$\sqrt{3x-1}+(1+x)^2$$ | yes |
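
Which delimiters are active is a property of the page's MathJax configuration, not of TeX itself. Below is a minimal sketch of a MathJax 3 setup consistent with the support column above; the option names come from the MathJax 3 `tex` input processor, and note that `$...$` is not a MathJax default and has to be switched on explicitly:

```html
<script>
  // Sketch only: a delimiter configuration matching the table above.
  window.MathJax = {
    tex: {
      // \(...\) is the MathJax default for inline math;
      // $...$ is off by default and must be enabled explicitly.
      inlineMath: [['$', '$'], ['\\(', '\\)']],
      // $$...$$ and \[...\] are the MathJax defaults for display math.
      displayMath: [['$$', '$$'], ['\\[', '\\]']]
    }
  };
</script>
<script src="https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js" async></script>
```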

## Empty

- \(\)
- $$
- \[\]
- $$$$

## Single Character

- \(a\)
- $a$
- \[a\]
- $$a$$

## Multiple on single line

- \(a\) \(b\)
- $a$ $b$
- \[a\] \[b\]
- $$a$$ $$b$$

## Underscore `_`

These cases also test how `_` survives the Markdown pass: when an expression is not recognized as math, the renderer's emphasis handling can consume the underscores, so `x_i = x_\gamma` comes out as an italicized `xi = x\gamma`.

### `\( single line \)`

\(x_i = x_\gamma\)

### `\( multiline \)`

\(
x_i = x_\gamma
\)



### `\[ single line \]`

\[x_i = x_\gamma\]

### `\[ multiline \]`

\[
x_i = x_\gamma
\]



### `$ single line $`

$x_i = x_\gamma$

### `$ multiline $` (not supported!)

A single-dollar pair is not matched across line breaks here, so the source below is left as plain text.

$
x_i = x_\gamma
$



### `$$ single line $$`

$$x_i = x_\gamma$$

### `$$ multiline $$`

$$
x_i = x_\gamma
$$



### `\begin{} multiline \end{}`

\begin{align}
x_i = x_\gamma
\end{align}

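The bare `\begin{align} ... \end{align}` block above renders with no surrounding delimiters at all. In MathJax 3 this behavior is controlled by the `tex.processEnvironments` option, which is on by default; a sketch:

```html
<script>
  window.MathJax = {
    tex: {
      // When true (the MathJax 3 default), a top-level
      // \begin{...} ... \end{...} environment is treated as
      // display math even without \[...\] or $$...$$ around it.
      processEnvironments: true
    }
  };
</script>
```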


## Escapes

### Dollar Sign

\$6.20 and \$0.5

$4.40
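
The `\$` escape only does something when `$` is configured as a math delimiter in the first place; MathJax's `tex.processEscapes` option (enabled by default in MathJax 3) is what makes `\$` produce a literal dollar sign, while an unmatched `$`, as in `$4.40`, is simply left alone. A sketch:

```html
<script>
  window.MathJax = {
    tex: {
      inlineMath: [['$', '$'], ['\\(', '\\)']],
      // With processEscapes enabled (the MathJax 3 default),
      // \$ in page text yields a literal $ instead of opening
      // or closing an inline math expression.
      processEscapes: true
    }
  };
</script>
```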


## Examples

### Using TeX notation

When $a \ne 0$, there are two solutions to \(ax^2 + bx + c = 0\) and they are $$x = {-b \pm \sqrt{b^2-4ac} \over 2a}.$$


### Several examples of TeX equations

#### The Lorenz Equations

\begin{align}
\dot{x} & = \sigma(y-x) \\
\dot{y} & = \rho x - y - xz \\
\dot{z} & = -\beta z + xy
\end{align}

#### The Cauchy-Schwarz Inequality

\[
\left( \sum_{k=1}^n a_k b_k \right)^{\!\!2} \leq \left( \sum_{k=1}^n a_k^2 \right) \left( \sum_{k=1}^n b_k^2 \right)
\]

#### A Cross Product Formula

\[
\mathbf{V}_1 \times \mathbf{V}_2 =
\begin{vmatrix}
\mathbf{i} & \mathbf{j} & \mathbf{k} \\
\frac{\partial X}{\partial u} & \frac{\partial Y}{\partial u} & 0 \\
\frac{\partial X}{\partial v} & \frac{\partial Y}{\partial v} & 0 \\
\end{vmatrix}
\]

The probability of getting \(k\) heads when flipping \(n\) coins is:

\[ P(E) = {n \choose k} p^k (1-p)^{n-k} \]

#### An Identity of Ramanujan

\[
\frac{1}{(\sqrt{\phi \sqrt{5}}-\phi) e^{\frac25 \pi}} =
1+\frac{e^{-2\pi}} {1+\frac{e^{-4\pi}} {1+\frac{e^{-6\pi}}
{1+\frac{e^{-8\pi}} {1+\ldots} } } }
\]

#### A Rogers-Ramanujan Identity

\[
1 + \frac{q^2}{(1-q)}+\frac{q^6}{(1-q)(1-q^2)}+\cdots =
\prod_{j=0}^{\infty}\frac{1}{(1-q^{5j+2})(1-q^{5j+3})},
\quad\quad \text{for $|q|<1$}.
\]

#### Maxwell's Equations

\begin{align}
\nabla \times \vec{\mathbf{B}} -\, \frac1c\, \frac{\partial\vec{\mathbf{E}}}{\partial t} & = \frac{4\pi}{c}\vec{\mathbf{j}} \\
\nabla \cdot \vec{\mathbf{E}} & = 4 \pi \rho \\
\nabla \times \vec{\mathbf{E}}\, +\, \frac1c\, \frac{\partial\vec{\mathbf{B}}}{\partial t} & = \vec{\mathbf{0}} \\
\nabla \cdot \vec{\mathbf{B}} & = 0
\end{align}

### In-line Mathematics

Finally, while display equations look good for a page of samples, the ability to mix math and text in a paragraph is also important. This expression \(\sqrt{3x-1}+(1+x)^2\) is an example of an inline equation. As you see, MathJax equations can be used this way as well, without unduly disturbing the spacing between lines.


## Misc

- $E = mc^2$

- \( A_i = B_i + C_i \sum_{k=0}^{i} D_k E^k \)

- \begin{eqnarray} A_i &=& B_i + C_i \sum_{k=0}^{i} D_k E^k \\ F_i &=& \int_{-\infty}^{x_i} f(x) dx \end{eqnarray}

- $\frac{w_x}{\sum_z x_z}$

- $\frac{w}{\sum_{z} x_z}$

- $x_\gamma = x_i$

- $x_i = x_\gamma$

Cost function of logistic regression (revision):

$$J(\theta) = - \frac{1}{m} \sum_{i=1}^m [ y^{(i)}\ \log (h_\theta (x^{(i)})) + (1 - y^{(i)})\ \log (1 - h_\theta(x^{(i)}))] + \frac{\lambda}{2m}\sum_{j=1}^n \theta_j^2$$

For Neural Networks, it is:

$$ J(\Theta) = - \frac{1}{m} \sum_{i=1}^m \sum_{k=1}^K \left[y^{(i)}_k \log ((h_\Theta (x^{(i)}))_k) + (1 - y^{(i)}_k)\log (1 - (h_\Theta(x^{(i)}))_k)\right] + \frac{\lambda}{2m}\sum_{l=1}^{L-1} \sum_{p=1}^{s_l} \sum_{n=1}^{s_{l+1}} ( \Theta_{n,p}^{(l)})^2 $$