
Commit b530d69 (1 parent: 65416d0)

fix typos in documentation and code comments as well as code

15 files changed: 25 additions and 25 deletions

CHANGELOG.md

Lines changed: 2 additions & 2 deletions

@@ -1,7 +1,7 @@
 # v0.11.6 (Upcoming Release)

 - Update citation of `robhatreg` a.k.a Robust Hat Matrix based Regression Estimator
-
+- Fix typos in code, code comments, and documentation

 # v0.11.5

@@ -107,7 +107,7 @@
 # v0.9.0

 - Add exact argument for LAD. If exact is true then the linear programming based exact solution is found. Otherwise, a GA based search is performed to yield approximate solutions.
-- Remove dependency of Plots.jl. If Plots.jl is installed and loaded manually, the functionality that uses Plot is autmatically loaded by Requires.jl. Affected functions are `dataimage`, `mveltsplot`, and `bchplot`.
+- Remove dependency of Plots.jl. If Plots.jl is installed and loaded manually, the functionality that uses Plot is automatically loaded by Requires.jl. Affected functions are `dataimage`, `mveltsplot`, and `bchplot`.


 # v0.8.19
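The v0.9.0 entry above notes that LAD's `exact` argument switches from a GA search to a linear-programming solution. The idea can be illustrated with a rough Python sketch (the package itself is Julia; `lad_exact` below is a made-up name, not its API): least absolute deviations regression rewritten as a linear program and solved with `scipy.optimize.linprog`.

```python
# Least absolute deviations (LAD) as a linear program:
# minimize sum_i u_i subject to u_i >= r_i and u_i >= -r_i, where r = y - X b.
import numpy as np
from scipy.optimize import linprog

def lad_exact(X, y):
    n, p = X.shape
    c = np.concatenate([np.zeros(p), np.ones(n)])      # objective: sum of u
    # u >= r   <=>  -X b - u <= -y
    # u >= -r  <=>   X b - u <=  y
    A_ub = np.block([[-X, -np.eye(n)], [X, -np.eye(n)]])
    b_ub = np.concatenate([-y, y])
    bounds = [(None, None)] * p + [(0, None)] * n      # b free, u nonnegative
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p]

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X = np.column_stack([np.ones(5), x])                   # intercept + regressor
y = 1.0 + 2.0 * x                                      # exact line y = 1 + 2x
betas = lad_exact(X, y)
```

Because the LP solver reaches a global optimum of this convex problem, the result is exact; the GA path in the package trades that guarantee for approximate solutions.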

CONTRIBUTING.md

Lines changed: 2 additions & 2 deletions

@@ -1,4 +1,4 @@
-## welcome, contributer!
+## welcome, contributor!

 Please read the [pull requests](https://docs.github.com/en/github/collaborating-with-issues-and-pull-requests/about-pull-requests)
-section in GitHub before preparing any feauture for this library.
+section in GitHub before preparing any feature for this library.

docs/src/algorithms.md

Lines changed: 1 addition & 1 deletion

@@ -130,7 +130,7 @@ LinRegOutliers.cm97
 LinRegOutliers.quantileregression
 ```

-## Theil-Sen estimator for multiple regresion
+## Theil-Sen estimator for multiple regression
 ```@docs
 LinRegOutliers.theilsen
 ```

examples.md

Lines changed: 2 additions & 2 deletions

@@ -19,7 +19,7 @@ where *β₀* and *β₁* are unknown intercept and slope parameters. In `R` and
 @formula(y ~ x)
 ```

-where ```~``` operator seperates the dependent and independent variables. When the model includes more than one regressors, the model can similarly be expressed as
+where the ```~``` operator separates the dependent and independent variables. When the model includes more than one regressor, the model can similarly be expressed as

 ```julia
 @formula(y ~ x1 + x2 + x3)

@@ -119,7 +119,7 @@ defines color using the Euclidean distance, whereas

 uses Mahalanobis distances for determining color values. The default distance metric is Euclidean distance.

-In the example below, the distances between observations are calculated and drawn using corresponding colors. Since the method is for multivariate data, only the desing matrix is used. In other terms, the response vector is omitted.
+In the example below, the distances between observations are calculated and drawn using corresponding colors. Since the method is for multivariate data, only the design matrix is used; in other words, the response vector is omitted.

 ```julia
 julia> # Matrix of independent variables of Hawkins & Bradu & Kass data
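The Euclidean-versus-Mahalanobis distinction patched above can be sketched with plain NumPy (a rough Python illustration, not the package's Julia implementation; the helper names are made up). Mahalanobis distances rescale deviations by the inverse covariance, so a point that breaks the correlation pattern stands out even when it is not far in raw coordinates.

```python
import numpy as np

def euclidean_distances(X):
    # Distance of each row from the vector of column means.
    center = X.mean(axis=0)
    return np.sqrt(((X - center) ** 2).sum(axis=1))

def mahalanobis_distances(X):
    # Same, but scaled by the inverse sample covariance, so directions with
    # large variance or strong correlation are downweighted.
    center = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - center
    return np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))

# Four points roughly on the line x2 = 2 * x1, plus one that breaks the pattern.
X = np.array([[1.0, 2.0], [2.0, 4.1], [3.0, 5.9], [4.0, 8.2], [10.0, 3.0]])
d_euc = euclidean_distances(X)
d_mah = mahalanobis_distances(X)
```

Here the last observation gets the largest distance under both metrics, but the Mahalanobis version flags it for the right reason: it violates the joint structure of the two columns, not merely its magnitude.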

src/bacon.jl

Lines changed: 2 additions & 2 deletions

@@ -55,7 +55,7 @@ It also guarantees that at least m indices are returned and that the selected in
 # Arguments
 - `X`: The multivariate matrix where each row is a data point.
 - `m`: The minimum number of points to include in the subset indices.
-- `distances`: The distances vector used for selecting minumum distance indices.
+- `distances`: The distances vector used for selecting minimum distance indices.
 """
 function select_subset(X::AbstractMatrix{Float64}, m::Int, distances::Array{Float64})
     rank_x = rank(X)

@@ -242,7 +242,7 @@ Run the BACON algorithm to detect outliers on regression data.
 - `alpha`: The quantile used for cutoff

 # Description
-BACON (Blocked Adaptive Computationally efficient Outlier Nominators) algoritm, defined in the citation below,
+BACON (Blocked Adaptive Computationally efficient Outlier Nominators) algorithm, defined in the citation below,
 has many versions, e.g. BACON for multivariate data, BACON for regression, etc. Since the design matrix of a
 regression model is multivariate data, BACON for multivariate data is performed in early stages of the algorithm.
 After selecting a clean subset of observations, a forward search is then applied. Observations with high
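The `select_subset` docstring patched above describes the first BACON stage: keep the `m` observations with the smallest distances as the initial clean subset. A minimal Python sketch of that selection step (illustrative only; the real Julia function additionally checks the rank of the selected rows, which is omitted here):

```python
import numpy as np

def select_smallest(distances, m):
    # Indices of the m observations closest to the bulk of the data;
    # BACON grows its clean "basic subset" outward from these points.
    return np.argsort(distances)[:m]

distances = np.array([0.5, 3.2, 0.1, 7.8, 0.9, 2.4])
subset = select_smallest(distances, 3)   # the 3 smallest-distance indices
```

The forward search then repeatedly recomputes distances relative to this subset and admits any observation falling below a chi-square-based cutoff, stopping when the subset no longer changes.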

src/basis.jl

Lines changed: 1 addition & 1 deletion

@@ -121,7 +121,7 @@ end

     designMatrix(setting)

-Return matrix of independent variables including the variable (ones) of the constanst term for a given regression setting.
+Return the matrix of independent variables, including the column of ones for the constant term, for a given regression setting.

 # Arguments
 - `setting::RegressionSetting`: A regression setting object.
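The design matrix described by this docstring — the regressors with a leading column of ones for the constant term — can be sketched in a few lines of NumPy (a Python illustration, not the package's Julia code):

```python
import numpy as np

def design_matrix(X):
    # Prepend a column of ones so the first fitted coefficient
    # acts as the intercept (constant term).
    n = X.shape[0]
    return np.column_stack([np.ones(n), X])

X = np.array([[2.0, 5.0],
              [3.0, 6.0]])
D = design_matrix(X)
# D is [[1., 2., 5.],
#       [1., 3., 6.]]
```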

src/deepestregression.jl

Lines changed: 1 addition & 1 deletion

@@ -11,7 +11,7 @@ using mrfDepth_jll: mrfDepth_jll
 """
     deepestregression(setting; maxit = 1000)

-Estimate Deepest Regression paramaters.
+Estimate Deepest Regression parameters.


 # Arguments

src/ga.jl

Lines changed: 2 additions & 2 deletions

@@ -152,12 +152,12 @@ function ga(
     maxs::AbstractVector{Float64},
     pcross::Float64,
     pmutate::Float64,
-    elitisim::Int,
+    elitism::Int,
     iterations::Int,
 )::Array{RealChromosome,1}
     pop = createPopulation(popsize, chsize, mins, maxs)
     for _ = 1:iterations
-        pop = Generation(pop, fcost, elitisim, pcross, pmutate)
+        pop = Generation(pop, fcost, elitism, pcross, pmutate)
     end
     return pop
 end
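The `ga` driver patched above creates a population, then repeatedly applies a generation step that carries the `elitism` best chromosomes forward unchanged. A compact Python sketch of a real-coded GA with elitism (not the package's implementation; the operators, parameter names, and defaults here are assumptions for illustration):

```python
import numpy as np

def ga_minimize(fcost, chsize, popsize=40, mins=-5.0, maxs=5.0,
                pcross=0.9, pmutate=0.1, elitism=2, iterations=100, seed=1):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(mins, maxs, size=(popsize, chsize))
    for _ in range(iterations):
        costs = np.array([fcost(ch) for ch in pop])
        order = np.argsort(costs)
        elite = pop[order[:elitism]].copy()   # elitism: best survive unchanged
        children = []
        while len(children) < popsize - elitism:
            i, j = rng.choice(popsize, size=2, replace=False)
            a, b = (pop[i], pop[j]) if costs[i] < costs[j] else (pop[j], pop[i])
            child = a.copy()
            if rng.random() < pcross:          # arithmetic crossover
                w = rng.random()
                child = w * a + (1.0 - w) * b
            if rng.random() < pmutate:         # small gaussian mutation
                child = child + rng.normal(0.0, 0.1, size=chsize)
            children.append(np.clip(child, mins, maxs))
        pop = np.vstack([elite, np.array(children)])
    costs = np.array([fcost(ch) for ch in pop])
    return pop[np.argmin(costs)]

# Minimize the sphere function centered at (1, 1).
best = ga_minimize(lambda ch: float(((ch - 1.0) ** 2).sum()), chsize=2)
```

Without elitism, crossover and mutation can destroy the best solution found so far; copying the top chromosomes into each new generation makes the best cost non-increasing across iterations.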

src/hadi1994.jl

Lines changed: 1 addition & 1 deletion

@@ -45,7 +45,7 @@ Dict{Any,Any} with 3 entries:
 ```

 # Reference
-Hadi, Ali S. "A modification of a method for the dedection of outliers in multivariate samples"
+Hadi, Ali S. "A modification of a method for the detection of outliers in multivariate samples"
 Journal of the Royal Statistical Society: Series B (Methodological) 56.2 (1994): 393-396.
 """
 function hadi1994(multivariateData::AbstractMatrix{Float64}; alpha = 0.05)

src/lms.jl

Lines changed: 3 additions & 3 deletions

@@ -68,7 +68,7 @@ function lms(X::AbstractMatrix{Float64}, y::AbstractVector{Float64}; iters = not
         iters = minimum([500 * p, 3000])
     end
     bestobjective = Inf
-    bestparamaters = Array{Float64}(undef, p)
+    bestparameters = Array{Float64}(undef, p)
     bestres = Array{Float64}(undef, n)
     indices = collect(1:n)
     kindices = collect(p:n)

@@ -83,7 +83,7 @@ function lms(X::AbstractMatrix{Float64}, y::AbstractVector{Float64}; iters = not
         res = sort!((y .- X * betas) .^ 2.0)
         m2 = res[h]
         if m2 < bestobjective
-            bestparamaters = betas
+            bestparameters = betas
             bestobjective = m2
             bestres = y .- X * betas
         end

@@ -94,7 +94,7 @@ function lms(X::AbstractMatrix{Float64}, y::AbstractVector{Float64}; iters = not
     s = 1.4826 * sqrt((1.0 + (5.0 / (n - p))) * bestobjective)
     standardizedres = bestres / s
     d = Dict{String, Any}()
-    d["betas"] = bestparamaters
+    d["betas"] = bestparameters
     d["objective"] = bestobjective
     d["S"] = s
     d["stdres"] = standardizedres
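The loop patched above is the core of least median of squares: fit exact solutions on random p-point subsets, score each fit by the h-th smallest squared residual, keep the best, then derive a robust scale estimate from the winning objective. A condensed Python sketch of the same idea (NumPy only; degenerate subsets and the package's exact choice of h are not reproduced — the h below is an assumption):

```python
import numpy as np

def lms_sketch(X, y, iters=1000, seed=42):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    h = n // 2 + (p + 1) // 2               # roughly half the sample
    best_obj, best_betas = np.inf, None
    for _ in range(iters):
        idx = rng.choice(n, size=p, replace=False)
        # Exact fit through p randomly chosen observations.
        betas, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        res2 = np.sort((y - X @ betas) ** 2)
        if res2[h - 1] < best_obj:          # h-th smallest squared residual
            best_obj, best_betas = res2[h - 1], betas
    # Robust scale estimate used for standardized residuals, as in the diff.
    s = 1.4826 * np.sqrt((1.0 + 5.0 / (n - p)) * best_obj)
    return best_betas, best_obj, s

# Line y = 1 + 2x with one gross outlier; LMS should ignore the outlier.
x = np.arange(1.0, 11.0)
X = np.column_stack([np.ones(10), x])
y = 1.0 + 2.0 * x
y[9] = 100.0                                # contaminate one observation
betas, obj, s = lms_sketch(X, y)
```

Because the objective only looks at the h-th smallest squared residual, a single contaminated point cannot inflate it, which is why LMS tolerates almost half the data being outliers.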
