
2.1. Calculation

Directory: specular/calculation/

The specular.calculation module provides five primary functions for calculating specular derivatives, depending on the dimensions of the input and output.

| Function | Space | Description | Input Type | Output Type |
|---|---|---|---|---|
| derivative | \(\mathbb{R} \to \mathbb{R}^m\) | specular derivative | float | float, np.ndarray |
| directional_derivative | \(\mathbb{R}^n \to \mathbb{R}\) | specular directional derivative in direction \(v \in \mathbb{R}^n\) | np.ndarray | float |
| partial_derivative | \(\mathbb{R}^n \to \mathbb{R}\) | specular partial derivative w.r.t. \(x_i\) (direction \(v = e_i\)) | np.ndarray | float |
| gradient | \(\mathbb{R}^n \to \mathbb{R}\) | specular gradient vector | np.ndarray | np.ndarray |
| jacobian | \(\mathbb{R}^n \to \mathbb{R}^m\) | specular Jacobian matrix | np.ndarray | np.ndarray |

2.1.1. One-dimensional Euclidean Space (\(n=1\))

In \(\mathbb{R}\), the specular derivative can be calculated using the function derivative.

import specular

def f(x):
    return max(x, 0.0)

print(specular.derivative(f, x=0.0))
0.41421356237309515
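The printed value is \(\tan(\pi/8) = \sqrt{2} - 1\), which is consistent with the specular derivative at a kink being the tangent of the average of the two one-sided slope angles. The following sketch reproduces the output under that assumption; it is an illustration of the idea, not the library's actual implementation:

```python
import math

def specular_derivative_sketch(f, x, h=1e-6):
    # One-sided difference quotients approximating the one-sided derivatives.
    right = (f(x + h) - f(x)) / h
    left = (f(x) - f(x - h)) / h
    # Average the slope *angles*, then map the mean angle back to a slope.
    return math.tan((math.atan(right) + math.atan(left)) / 2)

print(specular_derivative_sketch(lambda x: max(x, 0.0), 0.0))  # ≈ 0.41421356
```

For max(x, 0.0) at x = 0 the two quotients are exactly 1 and 0, so the sketch closely matches the printed output.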

2.1.2. The \(n\)-dimensional Euclidean Space (\(n>1\))

In \(\mathbb{R}^n\), the specular directional derivative of a function \(f: \mathbb{R}^n \to \mathbb{R}\) at a point \(x \in \mathbb{R}^n\) in the direction \(v \in \mathbb{R}^n\) can be calculated using the function directional_derivative.

import specular
import math 

f = lambda x: math.sqrt(x[0]**2 + x[1]**2 + x[2]**2)
print(specular.directional_derivative(f, x=[0.0, 0.1, -0.1], v=[1.0, -1.0, 2.0]))
-2.1213203434708223
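Since \(f(x) = \lVert x \rVert\) is differentiable away from the origin, the printed value equals the classical directional derivative \((x \cdot v)/\lVert x \rVert = -0.3/\sqrt{0.02} \approx -2.12132\). Below is a hypothetical sketch that applies the same angle-averaging reading of the outputs along the line \(t \mapsto f(x + tv)\); it is an inference from the examples, not the library's code:

```python
import math

def specular_directional_sketch(f, x, v, h=1e-6):
    # Restrict f to the line t -> f(x + t v) and take one-sided slopes at t = 0.
    line = lambda t: f([xi + t * vi for xi, vi in zip(x, v)])
    forward = (line(h) - line(0.0)) / h
    backward = (line(0.0) - line(-h)) / h
    # Tangent of the mean of the two one-sided slope angles.
    return math.tan((math.atan(forward) + math.atan(backward)) / 2)

f = lambda x: math.sqrt(x[0]**2 + x[1]**2 + x[2]**2)
print(specular_directional_sketch(f, [0.0, 0.1, -0.1], [1.0, -1.0, 2.0]))  # ≈ -2.12132
```

At a smooth point both one-sided slopes agree, so the angle average reduces to the ordinary directional derivative.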

Let \(e_1, e_2, \ldots, e_n\) be the standard basis of \(\mathbb{R}^n\). For each \(i \in \mathbb{N}\) with \(1 \leq i \leq n\), the specular partial derivative with respect to the variable \(x_i\) can be calculated using the function partial_derivative; it yields the same result as directional_derivative with direction \(v = e_i\).

import specular
import math

def f(x):
    return math.sqrt(x[0]**2 + x[1]**2 + x[2]**2)

print(specular.partial_derivative(f, x=[0.1, 2.3, -1.2], i=2))
print(specular.directional_derivative(f, x=[0.1, 2.3, -1.2], v=[0.0, 1.0, 0.0]))
0.8859268982863702
0.8859268982863702
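The equivalence holds because perturbing only the \(i\)-th coordinate is the same as moving along \(e_i\). The helper below is hypothetical and only illustrates the angle-averaging reading of the printed outputs (note the 1-indexed i, as in the library):

```python
import math

def specular_partial_sketch(f, x, i, h=1e-6):
    # 1-indexed i: perturb only the (i-1)-th coordinate, i.e. move along e_i.
    def line(t):
        y = list(x)
        y[i - 1] += t
        return f(y)
    forward = (line(h) - line(0.0)) / h
    backward = (line(0.0) - line(-h)) / h
    return math.tan((math.atan(forward) + math.atan(backward)) / 2)

f = lambda x: math.sqrt(x[0]**2 + x[1]**2 + x[2]**2)
print(specular_partial_sketch(f, [0.1, 2.3, -1.2], i=2))  # ≈ 0.88593
```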

The specular gradient can be calculated using the function gradient.

import specular
import numpy as np

def f(x):
    return np.linalg.norm(x)

print(specular.gradient(f, x=[0.1, 2.3, -1.2]))
print(specular.partial_derivative(f, x=[0.1, 2.3, -1.2], i=1))
print(specular.partial_derivative(f, x=[0.1, 2.3, -1.2], i=2))
print(specular.partial_derivative(f, x=[0.1, 2.3, -1.2], i=3))
[ 0.03851856  0.8859269  -0.46222273]
0.03851856078540371
0.8859268982863702
-0.4622227292028128
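The gradient stacks the \(n\) specular partial derivatives into one vector, which is why its entries match the three partial_derivative calls above. A hypothetical sketch, assuming each entry is the tangent of the mean of the one-sided slope angles (an inference from the outputs, not the library's implementation):

```python
import math
import numpy as np

def specular_gradient_sketch(f, x, h=1e-6):
    # Stack the specular partial derivatives (angle-averaged one-sided slopes).
    x = np.asarray(x, dtype=float)
    grad = np.empty_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        forward = (f(x + e) - f(x)) / h
        backward = (f(x) - f(x - e)) / h
        grad[i] = math.tan((math.atan(forward) + math.atan(backward)) / 2)
    return grad

print(specular_gradient_sketch(np.linalg.norm, [0.1, 2.3, -1.2]))  # ≈ [0.0385, 0.8859, -0.4622]
```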

2.1.3. API Reference

specular.calculation

This module provides implementations of specular directional derivatives, specular partial derivatives, specular derivatives, specular gradients, and specular Jacobians.

A(alpha, beta, zero_tol=1e-08, quasi_Fermat=False, monotonicity=False)

Compute the specular function A(alpha, beta).

Examples:

>>> import specular
>>> specular.A(1.0, 2.0)
1.3874258867227933
Source code in specular/calculation.py
def A(
    alpha: "float | np.number | int | np.ndarray",
    beta: "float | np.number | int | np.ndarray",
    zero_tol: float = 1e-8,
    quasi_Fermat: bool = False,
    monotonicity: bool = False,
) -> "float | np.ndarray | list[float] | list[np.ndarray]":
    """Compute the specular function A(alpha, beta).

    Examples:
        >>> import specular
        >>> specular.A(1.0, 2.0)
        1.3874258867227933
    """
    return _get_backend_module().A(
        alpha, beta, zero_tol, quasi_Fermat, monotonicity
    )
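The doctest value equals \(\tan\big((\arctan 1 + \arctan 2)/2\big) = (\sqrt{10} + 1)/3\), which suggests that A maps two slopes alpha and beta to the tangent of their mean slope angle. A quick check of that reading; this is an inference from the example, not a statement of the implementation:

```python
import math

def A_sketch(alpha, beta):
    # Tangent of the mean of the two slope angles atan(alpha) and atan(beta).
    return math.tan((math.atan(alpha) + math.atan(beta)) / 2)

print(A_sketch(1.0, 2.0))  # ≈ 1.3874258867227933
```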

derivative(f, x, h=1e-06, zero_tol=1e-08, quasi_Fermat=False, monotonicity=False)

Approximate the specular derivative of f at scalar x.

Examples:

>>> import specular
>>> f = lambda x: abs(x)
>>> specular.derivative(f, x=0.0)
0.0
Source code in specular/calculation.py
def derivative(
    f: "Callable[[int | float | np.number], int | float | np.number | list | np.ndarray]",
    x: ArrayLike,
    h: float = 1e-6,
    zero_tol: float = 1e-8,
    quasi_Fermat: bool = False,
    monotonicity: bool = False
) -> "float | np.ndarray | list[float] | list[np.ndarray]":
    """Approximate the specular derivative of f at scalar x.

    Examples:
        >>> import specular
        >>> f = lambda x: abs(x)
        >>> specular.derivative(f, x=0.0)
        0.0
    """
    if h <= 0:
        raise ValueError(f"Mesh size 'h' must be positive. Got {h}")

    return _get_backend_module().derivative(
        f, x, h, zero_tol, quasi_Fermat, monotonicity
    )

directional_derivative(f, x, v, h=1e-06, zero_tol=1e-08)

Approximate the specular directional derivative of f at x in direction v.

Source code in specular/calculation.py
def directional_derivative(
    f: "Callable[[list | np.ndarray], int | float | np.number]",
    x: ArrayLike,
    v: ArrayLike,
    h: float = 1e-6,
    zero_tol: float = 1e-8
) -> float:
    """Approximate the specular directional derivative of f at x in direction v."""
    if h <= 0:
        raise ValueError(f"Mesh size 'h' must be positive. Got {h}")

    return _get_backend_module().directional_derivative(f, x, v, h, zero_tol)

gradient(f, x, h=1e-06, zero_tol=1e-08, quasi_Fermat=False, monotonicity=False)

Approximate the specular gradient of f at x.

Examples:

>>> import specular
>>> import numpy as np
>>> f = lambda x: np.linalg.norm(x)
>>> specular.gradient(f, x=[1.4, -3.47, 4.57, 9.9])
array([ 0.12144298, -0.3010051 ,  0.39642458,  0.85877534])
Source code in specular/calculation.py
def gradient(
    f: "Callable[[list | np.ndarray], int | float | np.number]",
    x: ArrayLike,
    h: float = 1e-6,
    zero_tol: float = 1e-8,
    quasi_Fermat: bool = False,
    monotonicity: bool = False
) -> "np.ndarray | List[np.ndarray]":
    """Approximate the specular gradient of f at x.

    Examples:
        >>> import specular
        >>> import numpy as np
        >>> f = lambda x: np.linalg.norm(x)
        >>> specular.gradient(f, x=[1.4, -3.47, 4.57, 9.9])
        array([ 0.12144298, -0.3010051 ,  0.39642458,  0.85877534])
    """
    if h <= 0:
        raise ValueError(f"Mesh size 'h' must be positive. Got {h}")

    return _get_backend_module().gradient(
        f, x, h, zero_tol, quasi_Fermat, monotonicity
    )

jacobian(f, x, h=1e-06, zero_tol=1e-08, quasi_Fermat=False, monotonicity=False)

Approximate the specular Jacobian of f at x, shape (m, n).

Source code in specular/calculation.py
def jacobian(
    f: "Callable[[list | np.ndarray], int | float | np.number | list | np.ndarray]",
    x: ArrayLike,
    h: float = 1e-6,
    zero_tol: float = 1e-8,
    quasi_Fermat: bool = False,
    monotonicity: bool = False
) -> "np.ndarray | List[np.ndarray]":
    """Approximate the specular Jacobian of f at x, shape (m, n)."""
    if h <= 0:
        raise ValueError(f"Mesh size 'h' must be positive. Got {h}")

    return _get_backend_module().jacobian(
        f, x, h, zero_tol, quasi_Fermat, monotonicity
    )
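For a vector-valued \(f: \mathbb{R}^n \to \mathbb{R}^m\), row \(j\) of the (m, n) Jacobian should be the specular gradient of the \(j\)-th component of \(f\). A hypothetical sketch, assuming each entry is the tangent of the mean of the one-sided slope angles (an inference from the printed outputs, not the library's code); for a smooth \(f\) it reduces to the classical Jacobian:

```python
import numpy as np

def specular_jacobian_sketch(f, x, h=1e-6):
    x = np.asarray(x, dtype=float)
    fx = np.atleast_1d(np.asarray(f(x), dtype=float))
    J = np.empty((fx.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        forward = (np.atleast_1d(f(x + e)) - fx) / h
        backward = (fx - np.atleast_1d(f(x - e))) / h
        # Combine the one-sided slopes by averaging their angles, entrywise.
        J[:, i] = np.tan((np.arctan(forward) + np.arctan(backward)) / 2)
    return J

# f: R^2 -> R^2 is smooth here, so the result matches the classical Jacobian.
f = lambda x: np.array([x[0] * x[1], x[0] + x[1]])
print(specular_jacobian_sketch(f, [2.0, 3.0]))  # ≈ [[3. 2.] [1. 1.]]
```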

partial_derivative(f, x, i, h=1e-06, zero_tol=1e-08)

Approximate the i-th specular partial derivative of f at x (1-indexed).

Source code in specular/calculation.py
def partial_derivative(
    f: "Callable[[list | np.ndarray], int | float | np.number]",
    x: ArrayLike,
    i: "int | np.integer",
    h: float = 1e-6,
    zero_tol: float = 1e-8
) -> float:
    """Approximate the i-th specular partial derivative of f at x (1-indexed)."""
    if h <= 0:
        raise ValueError(f"Mesh size 'h' must be positive. Got {h}")

    return _get_backend_module().partial_derivative(f, x, i, h, zero_tol)