Last updated on 2023-12-10 04:56:03 CET.
Flavor | Version | Tinstall | Tcheck | Ttotal | Status | Flags |
---|---|---|---|---|---|---|
r-devel-linux-x86_64-debian-clang | 1.5.1 | 494.73 | 105.07 | 599.80 | ERROR | |
r-devel-linux-x86_64-debian-gcc | 1.5.1 | 401.15 | 81.01 | 482.16 | NOTE | |
r-devel-linux-x86_64-fedora-clang | 1.5.1 | | | 927.71 | WARN | |
r-devel-linux-x86_64-fedora-gcc | 1.5.1 | | | 1012.76 | NOTE | |
r-devel-windows-x86_64 | 1.5.1 | 2.00 | 4.00 | 6.00 | ERROR | |
r-patched-linux-x86_64 | 1.5.1 | 487.83 | 107.37 | 595.20 | NOTE | |
r-release-linux-x86_64 | 1.5.1 | 485.46 | 109.46 | 594.92 | NOTE | |
r-release-macos-arm64 | 1.5.1 | | | 250.00 | NOTE | |
r-release-macos-x86_64 | 1.5.1 | | | 405.00 | NOTE | |
r-release-windows-x86_64 | 1.5.1 | 3.00 | 5.00 | 8.00 | ERROR | |
r-oldrel-macos-arm64 | 1.5.1 | | | 243.00 | NOTE | |
r-oldrel-macos-x86_64 | 1.5.1 | | | 374.00 | NOTE | |
r-oldrel-windows-x86_64 | 1.5.1 | 516.00 | 200.00 | 716.00 | OK | |
Version: 1.5.1
Check: whether package can be installed
Result: WARN
Found the following significant warnings:
ClusterLauncher.cpp:201:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
ClusterLauncher.cpp:291:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
KmmLauncher.cpp:283:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
KmmLauncher.cpp:392:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
LearnLauncher.cpp:202:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
LearnLauncher.cpp:289:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
See ‘/home/hornik/tmp/R.check/r-devel-clang/Work/PKGS/MixAll.Rcheck/00install.out’ for details.
* used C compiler: ‘Debian clang version 17.0.5 (1)’
* used C++ compiler: ‘Debian clang version 17.0.5 (1)’
Flavor: r-devel-linux-x86_64-debian-clang
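The -Wformat-security warnings indicate that a string built at run time is being passed directly as the format argument of a printf-style R API call, so any '%' sequences in the message would be interpreted as format directives. The exact statements at the flagged lines are not shown in the check output; the sketch below only illustrates the usual pattern and fix, assuming an `Rf_error`-style call (the same applies to `Rprintf` and related functions).

```cpp
// Hypothetical illustration only; the real code at ClusterLauncher.cpp:201
// and the other flagged lines is not shown in the check output.
#include <string>
#include <R_ext/Error.h>   // Rf_error

void reportFailure(std::string const& msg)
{
  // Flagged pattern: the message itself is used as the format string,
  // so "%s", "%n", etc. inside msg would be interpreted by the formatter.
  // Rf_error(msg.c_str());

  // Usual fix: keep the format string a literal and pass the message
  // as an argument.
  Rf_error("%s", msg.c_str());
}
```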
Version: 1.5.1
Check: C++ specification
Result: NOTE
Specified C++11: please drop specification unless essential
Flavors: r-devel-linux-x86_64-debian-clang, r-devel-linux-x86_64-debian-gcc, r-devel-linux-x86_64-fedora-clang, r-devel-linux-x86_64-fedora-gcc, r-patched-linux-x86_64, r-release-linux-x86_64, r-release-macos-arm64, r-release-macos-x86_64
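This NOTE means the package still requests C++11 explicitly even though current R toolchains default to a newer standard. Unless the sources genuinely fail to compile under the default, the usual remedy is to remove the `CXX_STD = CXX11` line from `src/Makevars` (and `src/Makevars.win` if present) and any `SystemRequirements: C++11` entry from `DESCRIPTION`.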
Version: 1.5.1
Check: Rd files
Result: NOTE
checkRd: (-1) clusterAlgo.Rd:26: Lost braces; missing escapes or markup?
26 | \item \code{EM} {The Expectation Maximisation algorithm}
| ^
checkRd: (-1) clusterAlgo.Rd:27: Lost braces; missing escapes or markup?
27 | \item \code{CEM} {The Classification EM algorithm}
| ^
checkRd: (-1) clusterAlgo.Rd:28: Lost braces; missing escapes or markup?
28 | \item \code{SEM} {The Stochastic EM algorithm}
| ^
checkRd: (-1) clusterAlgo.Rd:29: Lost braces; missing escapes or markup?
29 | \item \code{SemiSEM} {The Semi-Stochastic EM algorithm}
| ^
checkRd: (-1) clusterAlgo.Rd:33: Lost braces; missing escapes or markup?
33 | \item \code{nbIteration} {Set the maximum number of iterations.}
| ^
checkRd: (-1) clusterAlgo.Rd:34: Lost braces; missing escapes or markup?
34 | \item \code{epsilon} {Set relative increase of the log-likelihood criterion.}
| ^
checkRd: (-1) clusterInit.Rd:34: Lost braces; missing escapes or markup?
34 | \item \code{random} {The initial parameters of the mixture are chosen randomly.}
| ^
checkRd: (-1) clusterInit.Rd:35: Lost braces; missing escapes or markup?
35 | \item \code{class} {The initial membership of individuals are sampled randomly.}
| ^
checkRd: (-1) clusterInit.Rd:36-37: Lost braces
36 | \item \code{fuzzy} {The initial probabilities of membership of individuals are
| ^
checkRd: (-1) learnAlgo.Rd:27: Lost braces; missing escapes or markup?
27 | \item \code{Impute} {Impute the missing values during the iterations}
| ^
checkRd: (-1) learnAlgo.Rd:28: Lost braces; missing escapes or markup?
28 | \item \code{Simul} {Simulate the missing values during the iterations}
| ^
checkRd: (-1) learnAlgo.Rd:32: Lost braces; missing escapes or markup?
32 | \item \code{nbIteration} {Set the maximum number of iterations.}
| ^
checkRd: (-1) learnAlgo.Rd:33: Lost braces; missing escapes or markup?
33 | \item \code{epsilon} {Set relative increase of the log-likelihood criterion.}
| ^
Flavors: r-devel-linux-x86_64-debian-clang, r-devel-linux-x86_64-debian-gcc
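All of these notes point at the same Rd idiom, e.g. `\item \code{EM} {The Expectation Maximisation algorithm}`: the braces around the description are not attached to any macro, so checkRd reports them as lost. Inside a `\describe{}` list the fix is the two-argument form `\item{\code{EM}}{The Expectation Maximisation algorithm}`; inside an `\itemize{}` list the stray braces around the description should simply be removed.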
Version: 1.5.1
Check: tests
Result: ERROR
Running ‘ClusterSimul.R’ [0s/1s]
Running ‘clusterDiagGaussianLikelihood.R’ [1s/2s]
Running ‘clusterGammaLikelihood.R’ [1s/2s]
Running ‘simulHeterogeneous.R’ [0s/0s]
Running ‘simulNonLinear.R’ [1s/1s]
Running ‘testAllLearners.R’ [1s/2s]
Running ‘testPoissonExample.R’ [1s/2s]
Running ‘testPredict.R’ [7s/9s]
Running the tests in ‘tests/testAllLearners.R’ failed.
Complete output:
> library(MixAll)
Loading required package: rtkore
Loading required package: Rcpp
Attaching package: 'rtkore'
The following object is masked from 'package:Rcpp':
LdFlags
> ## get data and target from iris data set
> data(iris)
> x <- as.matrix(iris[,1:4]); z <- as.vector(iris[,5]); n <- nrow(x); p <- ncol(x)
> ## add missing values at random
> indexes <- matrix(c(round(runif(5,1,n)), round(runif(5,1,p))), ncol=2)
> cbind(indexes, x[indexes])
[,1] [,2] [,3]
[1,] 123 1 7.7
[2,] 44 4 0.6
[3,] 56 4 1.3
[4,] 5 3 1.4
[5,] 11 3 1.5
> x[indexes] <- NA
> ## learn continuous model
> model <- learnDiagGaussian( data=x, labels= z, prop = c(1/3,1/3,1/3)
+ , models = clusterDiagGaussianNames(prop = "equal")
+ , algo = "simul", nbIter = 2, epsilon = 1e-08
+ )
> missingValues(model)
row col value
1 123 1 6.24570571
2 5 3 1.55829071
3 11 3 1.15596516
4 44 4 -0.06993778
5 56 4 1.00855002
> print(model)
****************************************
* model name = gaussian_p_sj
* data =
Sepal.Length Sepal.Width Petal.Length Petal.Width
[1,] 5.10000000 3.50000000 1.40000000 0.20000000
[2,] 4.90000000 3.00000000 1.40000000 0.20000000
[3,] 4.70000000 3.20000000 1.30000000 0.20000000
[4,] 4.60000000 3.10000000 1.50000000 0.20000000
[5,] 5.00000000 3.60000000 1.55829071 0.20000000
[6,] 5.40000000 3.90000000 1.70000000 0.40000000
[7,] 4.60000000 3.40000000 1.40000000 0.30000000
[8,] 5.00000000 3.40000000 1.50000000 0.20000000
[9,] 4.40000000 2.90000000 1.40000000 0.20000000
[10,] 4.90000000 3.10000000 1.50000000 0.10000000
[11,] 5.40000000 3.70000000 1.15596516 0.20000000
[12,] 4.80000000 3.40000000 1.60000000 0.20000000
[13,] 4.80000000 3.00000000 1.40000000 0.10000000
[14,] 4.30000000 3.00000000 1.10000000 0.10000000
[15,] 5.80000000 4.00000000 1.20000000 0.20000000
[16,] 5.70000000 4.40000000 1.50000000 0.40000000
[17,] 5.40000000 3.90000000 1.30000000 0.40000000
[18,] 5.10000000 3.50000000 1.40000000 0.30000000
[19,] 5.70000000 3.80000000 1.70000000 0.30000000
[20,] 5.10000000 3.80000000 1.50000000 0.30000000
[21,] 5.40000000 3.40000000 1.70000000 0.20000000
[22,] 5.10000000 3.70000000 1.50000000 0.40000000
[23,] 4.60000000 3.60000000 1.00000000 0.20000000
[24,] 5.10000000 3.30000000 1.70000000 0.50000000
[25,] 4.80000000 3.40000000 1.90000000 0.20000000
[26,] 5.00000000 3.00000000 1.60000000 0.20000000
[27,] 5.00000000 3.40000000 1.60000000 0.40000000
[28,] 5.20000000 3.50000000 1.50000000 0.20000000
[29,] 5.20000000 3.40000000 1.40000000 0.20000000
[30,] 4.70000000 3.20000000 1.60000000 0.20000000
[31,] 4.80000000 3.10000000 1.60000000 0.20000000
[32,] 5.40000000 3.40000000 1.50000000 0.40000000
[33,] 5.20000000 4.10000000 1.50000000 0.10000000
[34,] 5.50000000 4.20000000 1.40000000 0.20000000
[35,] 4.90000000 3.10000000 1.50000000 0.20000000
[36,] 5.00000000 3.20000000 1.20000000 0.20000000
[37,] 5.50000000 3.50000000 1.30000000 0.20000000
[38,] 4.90000000 3.60000000 1.40000000 0.10000000
[39,] 4.40000000 3.00000000 1.30000000 0.20000000
[40,] 5.10000000 3.40000000 1.50000000 0.20000000
[41,] 5.00000000 3.50000000 1.30000000 0.30000000
[42,] 4.50000000 2.30000000 1.30000000 0.30000000
[43,] 4.40000000 3.20000000 1.30000000 0.20000000
[44,] 5.00000000 3.50000000 1.60000000 -0.06993778
[45,] 5.10000000 3.80000000 1.90000000 0.40000000
[46,] 4.80000000 3.00000000 1.40000000 0.30000000
[47,] 5.10000000 3.80000000 1.60000000 0.20000000
[48,] 4.60000000 3.20000000 1.40000000 0.20000000
[49,] 5.30000000 3.70000000 1.50000000 0.20000000
[50,] 5.00000000 3.30000000 1.40000000 0.20000000
[51,] 7.00000000 3.20000000 4.70000000 1.40000000
[52,] 6.40000000 3.20000000 4.50000000 1.50000000
[53,] 6.90000000 3.10000000 4.90000000 1.50000000
[54,] 5.50000000 2.30000000 4.00000000 1.30000000
[55,] 6.50000000 2.80000000 4.60000000 1.50000000
[56,] 5.70000000 2.80000000 4.50000000 1.00855002
[57,] 6.30000000 3.30000000 4.70000000 1.60000000
[58,] 4.90000000 2.40000000 3.30000000 1.00000000
[59,] 6.60000000 2.90000000 4.60000000 1.30000000
[60,] 5.20000000 2.70000000 3.90000000 1.40000000
[61,] 5.00000000 2.00000000 3.50000000 1.00000000
[62,] 5.90000000 3.00000000 4.20000000 1.50000000
[63,] 6.00000000 2.20000000 4.00000000 1.00000000
[64,] 6.10000000 2.90000000 4.70000000 1.40000000
[65,] 5.60000000 2.90000000 3.60000000 1.30000000
[66,] 6.70000000 3.10000000 4.40000000 1.40000000
[67,] 5.60000000 3.00000000 4.50000000 1.50000000
[68,] 5.80000000 2.70000000 4.10000000 1.00000000
[69,] 6.20000000 2.20000000 4.50000000 1.50000000
[70,] 5.60000000 2.50000000 3.90000000 1.10000000
[71,] 5.90000000 3.20000000 4.80000000 1.80000000
[72,] 6.10000000 2.80000000 4.00000000 1.30000000
[73,] 6.30000000 2.50000000 4.90000000 1.50000000
[74,] 6.10000000 2.80000000 4.70000000 1.20000000
[75,] 6.40000000 2.90000000 4.30000000 1.30000000
[76,] 6.60000000 3.00000000 4.40000000 1.40000000
[77,] 6.80000000 2.80000000 4.80000000 1.40000000
[78,] 6.70000000 3.00000000 5.00000000 1.70000000
[79,] 6.00000000 2.90000000 4.50000000 1.50000000
[80,] 5.70000000 2.60000000 3.50000000 1.00000000
[81,] 5.50000000 2.40000000 3.80000000 1.10000000
[82,] 5.50000000 2.40000000 3.70000000 1.00000000
[83,] 5.80000000 2.70000000 3.90000000 1.20000000
[84,] 6.00000000 2.70000000 5.10000000 1.60000000
[85,] 5.40000000 3.00000000 4.50000000 1.50000000
[86,] 6.00000000 3.40000000 4.50000000 1.60000000
[87,] 6.70000000 3.10000000 4.70000000 1.50000000
[88,] 6.30000000 2.30000000 4.40000000 1.30000000
[89,] 5.60000000 3.00000000 4.10000000 1.30000000
[90,] 5.50000000 2.50000000 4.00000000 1.30000000
[91,] 5.50000000 2.60000000 4.40000000 1.20000000
[92,] 6.10000000 3.00000000 4.60000000 1.40000000
[93,] 5.80000000 2.60000000 4.00000000 1.20000000
[94,] 5.00000000 2.30000000 3.30000000 1.00000000
[95,] 5.60000000 2.70000000 4.20000000 1.30000000
[96,] 5.70000000 3.00000000 4.20000000 1.20000000
[97,] 5.70000000 2.90000000 4.20000000 1.30000000
[98,] 6.20000000 2.90000000 4.30000000 1.30000000
[99,] 5.10000000 2.50000000 3.00000000 1.10000000
[100,] 5.70000000 2.80000000 4.10000000 1.30000000
[101,] 6.30000000 3.30000000 6.00000000 2.50000000
[102,] 5.80000000 2.70000000 5.10000000 1.90000000
[103,] 7.10000000 3.00000000 5.90000000 2.10000000
[104,] 6.30000000 2.90000000 5.60000000 1.80000000
[105,] 6.50000000 3.00000000 5.80000000 2.20000000
[106,] 7.60000000 3.00000000 6.60000000 2.10000000
[107,] 4.90000000 2.50000000 4.50000000 1.70000000
[108,] 7.30000000 2.90000000 6.30000000 1.80000000
[109,] 6.70000000 2.50000000 5.80000000 1.80000000
[110,] 7.20000000 3.60000000 6.10000000 2.50000000
[111,] 6.50000000 3.20000000 5.10000000 2.00000000
[112,] 6.40000000 2.70000000 5.30000000 1.90000000
[113,] 6.80000000 3.00000000 5.50000000 2.10000000
[114,] 5.70000000 2.50000000 5.00000000 2.00000000
[115,] 5.80000000 2.80000000 5.10000000 2.40000000
[116,] 6.40000000 3.20000000 5.30000000 2.30000000
[117,] 6.50000000 3.00000000 5.50000000 1.80000000
[118,] 7.70000000 3.80000000 6.70000000 2.20000000
[119,] 7.70000000 2.60000000 6.90000000 2.30000000
[120,] 6.00000000 2.20000000 5.00000000 1.50000000
[121,] 6.90000000 3.20000000 5.70000000 2.30000000
[122,] 5.60000000 2.80000000 4.90000000 2.00000000
[123,] 6.24570571 2.80000000 6.70000000 2.00000000
[124,] 6.30000000 2.70000000 4.90000000 1.80000000
[125,] 6.70000000 3.30000000 5.70000000 2.10000000
[126,] 7.20000000 3.20000000 6.00000000 1.80000000
[127,] 6.20000000 2.80000000 4.80000000 1.80000000
[128,] 6.10000000 3.00000000 4.90000000 1.80000000
[129,] 6.40000000 2.80000000 5.60000000 2.10000000
[130,] 7.20000000 3.00000000 5.80000000 1.60000000
[131,] 7.40000000 2.80000000 6.10000000 1.90000000
[132,] 7.90000000 3.80000000 6.40000000 2.00000000
[133,] 6.40000000 2.80000000 5.60000000 2.20000000
[134,] 6.30000000 2.80000000 5.10000000 1.50000000
[135,] 6.10000000 2.60000000 5.60000000 1.40000000
[136,] 7.70000000 3.00000000 6.10000000 2.30000000
[137,] 6.30000000 3.40000000 5.60000000 2.40000000
[138,] 6.40000000 3.10000000 5.50000000 1.80000000
[139,] 6.00000000 3.00000000 4.80000000 1.80000000
[140,] 6.90000000 3.10000000 5.40000000 2.10000000
[141,] 6.70000000 3.10000000 5.60000000 2.40000000
[142,] 6.90000000 3.10000000 5.10000000 2.30000000
[143,] 5.80000000 2.70000000 5.10000000 1.90000000
[144,] 6.80000000 3.20000000 5.90000000 2.30000000
[145,] 6.70000000 3.30000000 5.70000000 2.50000000
[146,] 6.70000000 3.00000000 5.20000000 2.30000000
[147,] 6.30000000 2.50000000 5.00000000 1.90000000
[148,] 6.50000000 3.00000000 5.20000000 2.00000000
[149,] 6.20000000 3.40000000 5.40000000 2.30000000
[150,] 5.90000000 3.00000000 5.10000000 1.80000000
* missing =
row col
[1,] 123 1
[2,] 5 3
[3,] 11 3
[4,] 44 4
[5,] 56 4
* nbSample = 150
* nbCluster = 3
* lnLikelihood = -1013.786
* nbFreeParameter= 70
* criterion name = ICL
* criterion value= 2385.587
* zi =
[1] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[38] 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[75] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
[112] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
[149] 2 2
****************************************
*** Cluster: 1
* Proportion = 0.3333333
* Means = 5.0060000 3.4280000 1.4582851 0.2326012
* S.D. = 0.5019596 0.3362737 0.4267673 0.2036422
****************************************
*** Cluster: 2
* Proportion = 0.3333333
* Means = 5.936000 2.770000 4.260000 1.320171
* S.D. = 0.5019596 0.3362737 0.4267673 0.2036422
****************************************
*** Cluster: 3
* Proportion = 0.3333333
* Means = 6.558914 2.974000 5.552000 2.026000
* S.D. = 0.5019596 0.3362737 0.4267673 0.2036422
****************************************
> model <- learnDiagGaussian( data=x, labels= z,
+ , models = clusterDiagGaussianNames(prop = "equal")
+ , algo = "impute", nbIter = 2, epsilon = 1e-08)
> missingValues(model)
row col value
> print(model)
****************************************
* model name = gaussian_p_sjk
* data =
Sepal.Length Sepal.Width Petal.Length Petal.Width
[1,] 5.10000000 3.50000000 1.40000000 0.20000000
[2,] 4.90000000 3.00000000 1.40000000 0.20000000
[3,] 4.70000000 3.20000000 1.30000000 0.20000000
[4,] 4.60000000 3.10000000 1.50000000 0.20000000
[5,] 5.00000000 3.60000000 1.55829071 0.20000000
[6,] 5.40000000 3.90000000 1.70000000 0.40000000
[7,] 4.60000000 3.40000000 1.40000000 0.30000000
[8,] 5.00000000 3.40000000 1.50000000 0.20000000
[9,] 4.40000000 2.90000000 1.40000000 0.20000000
[10,] 4.90000000 3.10000000 1.50000000 0.10000000
[11,] 5.40000000 3.70000000 1.15596516 0.20000000
[12,] 4.80000000 3.40000000 1.60000000 0.20000000
[13,] 4.80000000 3.00000000 1.40000000 0.10000000
[14,] 4.30000000 3.00000000 1.10000000 0.10000000
[15,] 5.80000000 4.00000000 1.20000000 0.20000000
[16,] 5.70000000 4.40000000 1.50000000 0.40000000
[17,] 5.40000000 3.90000000 1.30000000 0.40000000
[18,] 5.10000000 3.50000000 1.40000000 0.30000000
[19,] 5.70000000 3.80000000 1.70000000 0.30000000
[20,] 5.10000000 3.80000000 1.50000000 0.30000000
[21,] 5.40000000 3.40000000 1.70000000 0.20000000
[22,] 5.10000000 3.70000000 1.50000000 0.40000000
[23,] 4.60000000 3.60000000 1.00000000 0.20000000
[24,] 5.10000000 3.30000000 1.70000000 0.50000000
[25,] 4.80000000 3.40000000 1.90000000 0.20000000
[26,] 5.00000000 3.00000000 1.60000000 0.20000000
[27,] 5.00000000 3.40000000 1.60000000 0.40000000
[28,] 5.20000000 3.50000000 1.50000000 0.20000000
[29,] 5.20000000 3.40000000 1.40000000 0.20000000
[30,] 4.70000000 3.20000000 1.60000000 0.20000000
[31,] 4.80000000 3.10000000 1.60000000 0.20000000
[32,] 5.40000000 3.40000000 1.50000000 0.40000000
[33,] 5.20000000 4.10000000 1.50000000 0.10000000
[34,] 5.50000000 4.20000000 1.40000000 0.20000000
[35,] 4.90000000 3.10000000 1.50000000 0.20000000
[36,] 5.00000000 3.20000000 1.20000000 0.20000000
[37,] 5.50000000 3.50000000 1.30000000 0.20000000
[38,] 4.90000000 3.60000000 1.40000000 0.10000000
[39,] 4.40000000 3.00000000 1.30000000 0.20000000
[40,] 5.10000000 3.40000000 1.50000000 0.20000000
[41,] 5.00000000 3.50000000 1.30000000 0.30000000
[42,] 4.50000000 2.30000000 1.30000000 0.30000000
[43,] 4.40000000 3.20000000 1.30000000 0.20000000
[44,] 5.00000000 3.50000000 1.60000000 -0.06993778
[45,] 5.10000000 3.80000000 1.90000000 0.40000000
[46,] 4.80000000 3.00000000 1.40000000 0.30000000
[47,] 5.10000000 3.80000000 1.60000000 0.20000000
[48,] 4.60000000 3.20000000 1.40000000 0.20000000
[49,] 5.30000000 3.70000000 1.50000000 0.20000000
[50,] 5.00000000 3.30000000 1.40000000 0.20000000
[51,] 7.00000000 3.20000000 4.70000000 1.40000000
[52,] 6.40000000 3.20000000 4.50000000 1.50000000
[53,] 6.90000000 3.10000000 4.90000000 1.50000000
[54,] 5.50000000 2.30000000 4.00000000 1.30000000
[55,] 6.50000000 2.80000000 4.60000000 1.50000000
[56,] 5.70000000 2.80000000 4.50000000 1.00855002
[57,] 6.30000000 3.30000000 4.70000000 1.60000000
[58,] 4.90000000 2.40000000 3.30000000 1.00000000
[59,] 6.60000000 2.90000000 4.60000000 1.30000000
[60,] 5.20000000 2.70000000 3.90000000 1.40000000
[61,] 5.00000000 2.00000000 3.50000000 1.00000000
[62,] 5.90000000 3.00000000 4.20000000 1.50000000
[63,] 6.00000000 2.20000000 4.00000000 1.00000000
[64,] 6.10000000 2.90000000 4.70000000 1.40000000
[65,] 5.60000000 2.90000000 3.60000000 1.30000000
[66,] 6.70000000 3.10000000 4.40000000 1.40000000
[67,] 5.60000000 3.00000000 4.50000000 1.50000000
[68,] 5.80000000 2.70000000 4.10000000 1.00000000
[69,] 6.20000000 2.20000000 4.50000000 1.50000000
[70,] 5.60000000 2.50000000 3.90000000 1.10000000
[71,] 5.90000000 3.20000000 4.80000000 1.80000000
[72,] 6.10000000 2.80000000 4.00000000 1.30000000
[73,] 6.30000000 2.50000000 4.90000000 1.50000000
[74,] 6.10000000 2.80000000 4.70000000 1.20000000
[75,] 6.40000000 2.90000000 4.30000000 1.30000000
[76,] 6.60000000 3.00000000 4.40000000 1.40000000
[77,] 6.80000000 2.80000000 4.80000000 1.40000000
[78,] 6.70000000 3.00000000 5.00000000 1.70000000
[79,] 6.00000000 2.90000000 4.50000000 1.50000000
[80,] 5.70000000 2.60000000 3.50000000 1.00000000
[81,] 5.50000000 2.40000000 3.80000000 1.10000000
[82,] 5.50000000 2.40000000 3.70000000 1.00000000
[83,] 5.80000000 2.70000000 3.90000000 1.20000000
[84,] 6.00000000 2.70000000 5.10000000 1.60000000
[85,] 5.40000000 3.00000000 4.50000000 1.50000000
[86,] 6.00000000 3.40000000 4.50000000 1.60000000
[87,] 6.70000000 3.10000000 4.70000000 1.50000000
[88,] 6.30000000 2.30000000 4.40000000 1.30000000
[89,] 5.60000000 3.00000000 4.10000000 1.30000000
[90,] 5.50000000 2.50000000 4.00000000 1.30000000
[91,] 5.50000000 2.60000000 4.40000000 1.20000000
[92,] 6.10000000 3.00000000 4.60000000 1.40000000
[93,] 5.80000000 2.60000000 4.00000000 1.20000000
[94,] 5.00000000 2.30000000 3.30000000 1.00000000
[95,] 5.60000000 2.70000000 4.20000000 1.30000000
[96,] 5.70000000 3.00000000 4.20000000 1.20000000
[97,] 5.70000000 2.90000000 4.20000000 1.30000000
[98,] 6.20000000 2.90000000 4.30000000 1.30000000
[99,] 5.10000000 2.50000000 3.00000000 1.10000000
[100,] 5.70000000 2.80000000 4.10000000 1.30000000
[101,] 6.30000000 3.30000000 6.00000000 2.50000000
[102,] 5.80000000 2.70000000 5.10000000 1.90000000
[103,] 7.10000000 3.00000000 5.90000000 2.10000000
[104,] 6.30000000 2.90000000 5.60000000 1.80000000
[105,] 6.50000000 3.00000000 5.80000000 2.20000000
[106,] 7.60000000 3.00000000 6.60000000 2.10000000
[107,] 4.90000000 2.50000000 4.50000000 1.70000000
[108,] 7.30000000 2.90000000 6.30000000 1.80000000
[109,] 6.70000000 2.50000000 5.80000000 1.80000000
[110,] 7.20000000 3.60000000 6.10000000 2.50000000
[111,] 6.50000000 3.20000000 5.10000000 2.00000000
[112,] 6.40000000 2.70000000 5.30000000 1.90000000
[113,] 6.80000000 3.00000000 5.50000000 2.10000000
[114,] 5.70000000 2.50000000 5.00000000 2.00000000
[115,] 5.80000000 2.80000000 5.10000000 2.40000000
[116,] 6.40000000 3.20000000 5.30000000 2.30000000
[117,] 6.50000000 3.00000000 5.50000000 1.80000000
[118,] 7.70000000 3.80000000 6.70000000 2.20000000
[119,] 7.70000000 2.60000000 6.90000000 2.30000000
[120,] 6.00000000 2.20000000 5.00000000 1.50000000
[121,] 6.90000000 3.20000000 5.70000000 2.30000000
[122,] 5.60000000 2.80000000 4.90000000 2.00000000
[123,] 6.24570571 2.80000000 6.70000000 2.00000000
[124,] 6.30000000 2.70000000 4.90000000 1.80000000
[125,] 6.70000000 3.30000000 5.70000000 2.10000000
[126,] 7.20000000 3.20000000 6.00000000 1.80000000
[127,] 6.20000000 2.80000000 4.80000000 1.80000000
[128,] 6.10000000 3.00000000 4.90000000 1.80000000
[129,] 6.40000000 2.80000000 5.60000000 2.10000000
[130,] 7.20000000 3.00000000 5.80000000 1.60000000
[131,] 7.40000000 2.80000000 6.10000000 1.90000000
[132,] 7.90000000 3.80000000 6.40000000 2.00000000
[133,] 6.40000000 2.80000000 5.60000000 2.20000000
[134,] 6.30000000 2.80000000 5.10000000 1.50000000
[135,] 6.10000000 2.60000000 5.60000000 1.40000000
[136,] 7.70000000 3.00000000 6.10000000 2.30000000
[137,] 6.30000000 3.40000000 5.60000000 2.40000000
[138,] 6.40000000 3.10000000 5.50000000 1.80000000
[139,] 6.00000000 3.00000000 4.80000000 1.80000000
[140,] 6.90000000 3.10000000 5.40000000 2.10000000
[141,] 6.70000000 3.10000000 5.60000000 2.40000000
[142,] 6.90000000 3.10000000 5.10000000 2.30000000
[143,] 5.80000000 2.70000000 5.10000000 1.90000000
[144,] 6.80000000 3.20000000 5.90000000 2.30000000
[145,] 6.70000000 3.30000000 5.70000000 2.50000000
[146,] 6.70000000 3.00000000 5.20000000 2.30000000
[147,] 6.30000000 2.50000000 5.00000000 1.90000000
[148,] 6.50000000 3.00000000 5.20000000 2.00000000
[149,] 6.20000000 3.40000000 5.40000000 2.30000000
[150,] 5.90000000 3.00000000 5.10000000 1.80000000
* missing =
row col
* nbSample = 150
* nbCluster = 3
* lnLikelihood = -1017.688
* nbFreeParameter= 70
* criterion name = ICL
* criterion value= 2393.331
* zi =
[1] 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
[38] 0 0 0 0 0 0 0 0 0 0 0 0 0 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1
[75] 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 1 2 2 2 2 2 2 2 2 2 2 2
[112] 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2 2
[149] 2 2
****************************************
*** Cluster: 1
* Proportion = 0.3333333
* Means = 5.0060000 3.4280000 1.4582851 0.2326012
* S.D. = 0.3489470 0.3752546 0.1774684 0.1009678
****************************************
*** Cluster: 2
* Proportion = 0.3333333
* Means = 5.936000 2.770000 4.260000 1.320171
* S.D. = 0.5109834 0.3106445 0.4651881 0.2007287
****************************************
*** Cluster: 3
* Proportion = 0.3333333
* Means = 6.558914 2.974000 5.552000 2.026000
* S.D. = 0.6107556 0.3192554 0.5463479 0.2718897
****************************************
> model <- learnGamma( data=x, labels= z,
+ , models = clusterGammaNames(prop = "equal")
+ , algo = "simul", nbIter = 2, epsilon = 1e-08
+ )
*** caught segfault ***
address 0xe0, cause 'memory not mapped'
Traceback:
1: learnGamma(data = x, labels = z, , models = clusterGammaNames(prop = "equal"), algo = "simul", nbIter = 2, epsilon = 1e-08)
An irrecoverable exception occurred. R is aborting now ...
Segmentation fault
Flavor: r-devel-linux-x86_64-debian-clang
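The crash happens in the compiled back end as soon as `learnGamma(..., algo = "simul")` is called, immediately after the diagonal Gaussian learners succeed on the same data with injected NAs. A presumably sufficient way to isolate it is to rerun only that call on the prepared iris data, e.g. `learnGamma(data = x, labels = z, models = clusterGammaNames(prop = "equal"), algo = "simul", nbIter = 2, epsilon = 1e-08)`; the original call shown in the traceback also carries a stray empty argument after `labels = z`, which is worth dropping when reproducing, even though the same stray argument did not trip up the Gaussian calls above.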
Version: 1.5.1
Check: whether package can be installed
Result: WARN
Found the following significant warnings:
ClusterLauncher.cpp:201:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
ClusterLauncher.cpp:291:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
KmmLauncher.cpp:283:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
KmmLauncher.cpp:392:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
LearnLauncher.cpp:202:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
LearnLauncher.cpp:289:16: warning: format string is not a string literal (potentially insecure) [-Wformat-security]
See ‘/data/gannet/ripley/R/packages/tests-clang/MixAll.Rcheck/00install.out’ for details.
* used C compiler: ‘clang version 17.0.5’
* used C++ compiler: ‘clang version 17.0.5’
Flavor: r-devel-linux-x86_64-fedora-clang
Version: 1.5.1
Check: whether package can be installed
Result: ERROR
Installation failed.
Flavors: r-devel-windows-x86_64, r-release-windows-x86_64
Version: 1.5.1
Check: installed package size
Result: NOTE
installed size is 27.9Mb
sub-directories of 1Mb or more:
libs 25.7Mb
Flavors: r-release-macos-arm64, r-release-macos-x86_64, r-oldrel-macos-arm64, r-oldrel-macos-x86_64