feat!: rename blog to "Blog" (not "Posts")
jolars committed May 25, 2024
1 parent 6e47991 commit 693a85d
Showing 104 changed files with 1,031 additions and 23 deletions.
12 changes: 12 additions & 0 deletions _freeze/blog/2024-05-25-moloch/index/execute-results/html.json
@@ -0,0 +1,12 @@
{
"hash": "6bb6c4fffa3862292ae04c3ce364c261",
"result": {
"engine": "jupyter",
"markdown": "---\ntitle: \"Moloch: A Revival of the Metropolis LaTeX Beamer Theme\"\nauthor: Johan Larsson\ndate: \"2024-05-15\"\ndescription: |\n Metropolis is a popular and modern beamer theme for LaTeX, but it is\n unfortunately no longer actively maintained and contains several bugs. \n Moloch is a fork of Metropolis that aims to fix these issues and add\n a few new features.\ncategories:\n - latex\n - software\n - presentations\n - beamer\nimage: moloch-logo.svg\nfig-cap-location: margin\n---\n\nMy beamer configuration for LaTeX presentations has been in a state of flux for as long as I can remember. \nI have tried many different themes and configurations, and have typically tried to keep the theme minimalisic\nbut at the same time functional and visually appealing. Nevertheless, I have frequently found\nmyself scrapping my custom modifications and returning to the [Metropolis theme](https://github.com/matze/mtheme),\nwhich I think is the most well-designed theme that I have so far encountered for beamer.\n\nThe only problem is that Metropolis is no longer actively maintained. The latest update (at the time of writing)\nwas six years ago and since then a number of issues have cropped up. Most of them are not major and can\nbe circumvented through various hacks and workarounds, but I have grown increasingly frustrated with the\nseparate file of Metropolis patches that I've had to keep around to fix these issues.\nBeamer tself in fact even includes several patches now in order to stop the theme from breaking (see \n[here](https://github.com/josephwright/beamer/blob/c0d91f15165421646b5383546e6195187b7f97c9/base/beamerbasecompatibility.sty#L253),\n[here](https://github.com/josephwright/beamer/blob/c0d91f15165421646b5383546e6195187b7f97c9/base/beamerbasesection.sty#L209), and \n[here](https://github.com/josephwright/beamer/blob/c0d91f15165421646b5383546e6195187b7f97c9/base/beamerbaselocalstructure.sty#L29).)\n\n## Moloch\n\nThis has now (since some months back actually) led me to fork Metropolis to try to fix these outstanding issues.\nI call the new theme *Moloch* (which is likely familiar to you if you know your Metropolis).\nThe original design is still pretty much intact save for a few minor tweaks, such as changing the green\ncolor from Metropolis to a teal color that is a little more color-blind friendly. The largest differences,\nexcept for pure bug fixes, is that the code for the theme has been simplified and made more robust.\nMetropolis, for instance, made much use of `\\patchcmd` from the etoolbox package to patch beamer \ntheme internals in order to support modifications to, for instance, frame titles. This was what\nlead the theme to break in the first place as beamer introduced changes in these commands and I \nhave thus opted to remove all this kind of patching in favor of relying on standard functionality\nfrom the beamer theme instead.\n\nThis comes at the price of sacrificing some features, such as toggling title formatting between uppercase,\nsmall caps, and regular text. But, as the Metropolis documentation itself noted, these\nmodifications were problematic in the first place and I therefore think that the removal of these\nfeatures is on the whole a good thing.\n\nI've also removed the pgfplots theme that was included in Metropolis. 
I don't mind the theme per se,\nbut I believe it was only of limited use and \n\n## Changes\n\nI've tried to outline the main changes that I can think of in the following sections.\n\n### New Secondary Color\n\nI always though the green color in Metropolis was lurid and not exactly color-blind friendly.\nI therefore changed it to a teal color that I think is a little more subdued and easier on the eyes.\n\nYou can see the difference in the figure below. I hope you agree that the\nnew color is an improvement!\n\n![The old versus the new secondary color.](new-secondary-color.svg){width=100%}\n\nOne of the future plans for Moloch is to make these colors more easily customizable.\n\n### Subtitles\n\nSubtitles are now supported in Moloch. They were were not supported in Metropolis\nbecause the author [thought subtitles were a bad idea i general](https://github.com/matze/mtheme/issues/135).\nOn the whole I agree that subtitles are usually best avoided, but I didn't see any reason\nto impose this opinion on others. Subtitles are therefore supported in Moloch.\n\n![Subtitles are supported in Moloch.](subtitles.png)\n\n### Frame Numbering\n\nMetropolis sported its own frame numbering system. There was nothing wrong with this \nsystem except it necessitated a separate package (appendixframenumber) to get frame/page numbers \nto restart (and not count towards the total number) for appendix slides.\nBeamer has, however, improved its own system in recent years and there is no longer\nany need for a custom solution (or separate package) to support this functionality.\nAs a result, Moloch just relies on beamer templates for the frame numbering.\nThe design is *slightly* different, with smaller font size and somewhat different margins,\nbut I only think this is for the better anyway.\n\nNow, you can just change it via the standard beamer commands for frame numbering, like so:\n\n```latex\n\\setbeamertemplate{page number in head/foot}[appendixframenumber]\n```\n\n### Title Page Redesign\n\nThe title page has been redesigned. The primary changes are\n\n1. The institute is now positioned below the author (rather than the date), which I think makes more sense.\n This was suggested in [an issue on the Metropolis repo](https://github.com/matze/mtheme/issues/180).\n2. The titlegraphic now has margins added above and below. It was previously put in a zero-height\n `vbox`, but this caused inconsistent bottom-top margins for the content on the title page.\n You can still achieve the behavior just be wrapping the title graphic in the same kind of\n `vbox`, however, so I see this as a less imposing default.\n3. The margins around the elements on the title page were changed everywhere. 
Please see the screenshots\n below to see what I mean, but the main change is that there is less spacing between the title and the\n subtitle and even spacing above and below the orange line.\n\n![The old title page from Metropolis](metropolis-titlepage.svg){width=100%}\n\n![The new title page in Moloch](moloch-titlepage.svg){width=100%}\n\nFor reference, the code for generating the slides is given below\n\n\n```{latex}\n#| code-fold: true\n#| eval: false\n\\documentclass[10pt]{beamer}\n\n\\usetheme{moloch}\n\n\\title{Title}\n\\subtitle{Subtitle}\n\\author{Author}\n\\institute{Institute}\n\\date{\\today}\n\\titlegraphic{\\hfill\\includegraphics[height=2cm]{logo.pdf}}\n\n\\begin{document}\n\\maketitle\n\\end{document}\n```\n\n\n### Better Support for Wide Presentations\n\nhttps://github.com/matze/mtheme/pull/384\n\n",
"supporting": [
"index_files"
],
"filters": [],
"includes": {}
}
}
4 changes: 2 additions & 2 deletions _freeze/blog/slope-0-2-0/index/execute-results/html.json
@@ -1,7 +1,7 @@
{
"hash": "b3b9685ad0551ad36272986315b25bbd",
"hash": "e083f5e1da4abc3d4edf54eb61515908",
"result": {
"markdown": "---\ntitle: \"SLOPE 0.2.0\"\nauthor: Johan Larsson\ndate: \"2020-04-14\"\ndescription: A new update to the SLOPE package with many exciting features.\ncategories:\n - r\n - regularization\n - SLOPE\n - statistics\nimage: slope.svg\nbibliography: bibliography.bib\n---\n\n\n\n\n## Introduction to SLOPE\n\nSLOPE [@bogdan2015] stands for sorted L1 penalized estimation and\nis a generalization of OSCAR [@bondell2008]. As the name \nsuggests, SLOPE\nis a type of $\\ell_1$-regularization. More specifically, SLOPE fits \ngeneralized linear models regularized with the sorted $\\ell_1$ norm. The\nobjective in SLOPE is\n\n$$\n\\operatorname{minimize}\\left\\{ f(\\beta) + J(\\beta \\mid \\lambda)\\right\\},\n$$\n\nwhere $f(\\beta)$ is typically the log-likelihood of some model in the \nfamily of generalized linear models and \n\n$$J(\\beta\\mid \\lambda) = \\sum_{i=1}^p \\lambda_i|\\beta|_{(i)}$$\n\nis the\nsorted $\\ell_1$ norm.\n\nSome people will note that this penalty is a generalization\nof the standard $\\ell_1$ norm penalty[^1]. As such,\nSLOPE is a type of sparse regression---just like the lasso. Unlike the lasso,\nhowever, SLOPE gracefully handles correlated features.\nWhereas the lasso often discards all but a few among a set of \ncorrelated features [@jia2010], \nSLOPE instead *clusters* such features together by setting such clusters to\nhave the same coefficient in absolut value.\n\n[^1]: Simply set $\\lambda_i = \\lambda_j$ for all $i,j \\in \\{1,\\dots,p\\}$ and you get the lasso penalty.\n\n## SLOPE 0.2.0\n\nSLOPE 0.2.0 is a new verison of the R package\n[SLOPE](https://CRAN.R-project.org/package=SLOPE) featuring a range of\nimprovements over the previous package. If you are completely new to the \npackage, please start with the [introductory vignette](https://jolars.github.io/SLOPE/articles/introduction.html).\n\n### More model families\n\nPreviously, SLOPE only features ordinary least-squares regression. Now the\npackage features logistic, Poisson, and multinomial regression on top of that.\nJust as in other similar packages, this is enabled simply by\nsetting `family = \"binomial\"` for logistic regression, for instance.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nlibrary(SLOPE)\nfit <- SLOPE(wine$x, wine$y, family = \"multinomial\")\n```\n:::\n\n\n### Regularization path fitting\n\nBy default, SLOPE now fits a full regularization path instead of\nonly a single penalty sequence at once. This behavior is now analogous with the \ndefault behavior in glmnet.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nplot(fit)\n```\n\n::: {.cell-output-display}\n![Coefficients from the regularization path for a multinomial model.](index_files/figure-html/unnamed-chunk-3-1.png){fig-align='center' width=768}\n:::\n:::\n\n\n### Predictor screening rules\n\nThe package now uses predictor screening rules to vastly improve performance\nin the $p \\gg n$ domain. Screening rules are part of what makes\nother related packages such as glmnet so efficient. 
In SLOPE, we use a\nvariant of the strong screening rules for the lasso [@tibshirani2012].\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nxy <- SLOPE:::randomProblem(100, 1000)\nsystem.time({SLOPE(xy$x, xy$y, screen = TRUE)})\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n user system elapsed \n 0.602 0.000 0.079 \n```\n:::\n\n```{.r .cell-code}\nsystem.time({SLOPE(xy$x, xy$y, screen = FALSE)})\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n user system elapsed \n 3.937 0.006 0.527 \n```\n:::\n:::\n\n\n### Cross-validation and caret\n\nThere is now a function `trainSLOPE()`, which can be used to run\ncross-validation for optimal selection of `sigma` and `q`. Here, we run\n8-fold cross-validation repeated 5 times.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\n# 8-fold cross-validation repeated 5 times\ntune <- trainSLOPE(\n subset(mtcars, select = c(\"mpg\", \"drat\", \"wt\")),\n mtcars$hp,\n q = c(0.1, 0.2),\n number = 8,\n repeats = 5\n)\nplot(tune)\n```\n\n::: {.cell-output-display}\n![Cross-validation with SLOPE.](index_files/figure-html/unnamed-chunk-5-1.png){fig-align='center' width=672}\n:::\n:::\n\n\nIn addition, the package now also features a function `caretSLOPE()` that\ncan be used via the excellent caret package, which enables a swath\nof resampling methods and comparisons.\n\n### C++ and ADMM\n\nAll of the performance-critical code for SLOPE has been rewritten in \nC++. In addition, the package now features an ADMM solver for\n`family = \"gaussian\"`, enabled by setting `solver = \"admm\"` in the call\nto `SLOPE()`. Preliminary testing shows that this solver is faster for\nmany designs, particularly when there is high correlation among predictors.\n\n### Sparse design matrices\n\nSLOPE now also allows sparse design matrcies of classes from the Matrix package.\n\n### And much more...\n\nFor a full list of changes, please\nsee [the changelog](https://jolars.github.io/SLOPE/news/index.html#slope-0-2-0-unreleased).\n\n## References\n\n",
"markdown": "---\ntitle: \"SLOPE 0.2.0\"\nauthor: Johan Larsson\ndate: \"2020-04-14\"\ndescription: A new update to the SLOPE package with many exciting features.\ncategories:\n - r\n - SLOPE\n - statistics\nimage: slope.svg\n---\n\n\n\n\n## Introduction to SLOPE\n\nSLOPE [@bogdan2015] stands for sorted L1 penalized estimation and\nis a generalization of OSCAR [@bondell2008]. As the name \nsuggests, SLOPE\nis a type of $\\ell_1$-regularization. More specifically, SLOPE fits \ngeneralized linear models regularized with the sorted $\\ell_1$ norm. The\nobjective in SLOPE is\n\n$$\n\\operatorname{minimize}\\left\\{ f(\\beta) + J(\\beta \\mid \\lambda)\\right\\},\n$$\n\nwhere $f(\\beta)$ is typically the log-likelihood of some model in the \nfamily of generalized linear models and \n\n$$J(\\beta\\mid \\lambda) = \\sum_{i=1}^p \\lambda_i|\\beta|_{(i)}$$\n\nis the\nsorted $\\ell_1$ norm.\n\nSome people will note that this penalty is a generalization\nof the standard $\\ell_1$ norm penalty[^1]. As such,\nSLOPE is a type of sparse regression---just like the lasso. Unlike the lasso,\nhowever, SLOPE gracefully handles correlated features.\nWhereas the lasso often discards all but a few among a set of \ncorrelated features [@jia2010], \nSLOPE instead *clusters* such features together by setting such clusters to\nhave the same coefficient in absolut value.\n\n[^1]: Simply set $\\lambda_i = \\lambda_j$ for all $i,j \\in \\{1,\\dots,p\\}$ and you get the lasso penalty.\n\n## SLOPE 0.2.0\n\nSLOPE 0.2.0 is a new verison of the R package\n[SLOPE](https://CRAN.R-project.org/package=SLOPE) featuring a range of\nimprovements over the previous package. If you are completely new to the \npackage, please start with the [introductory vignette](https://jolars.github.io/SLOPE/articles/introduction.html).\n\n### More model families\n\nPreviously, SLOPE only features ordinary least-squares regression. Now the\npackage features logistic, Poisson, and multinomial regression on top of that.\nJust as in other similar packages, this is enabled simply by\nsetting `family = \"binomial\"` for logistic regression, for instance.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nlibrary(SLOPE)\nfit <- SLOPE(wine$x, wine$y, family = \"multinomial\")\n```\n:::\n\n\n### Regularization path fitting\n\nBy default, SLOPE now fits a full regularization path instead of\nonly a single penalty sequence at once. This behavior is now analogous with the \ndefault behavior in glmnet.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nplot(fit)\n```\n\n::: {.cell-output-display}\n![Coefficients from the regularization path for a multinomial model.](index_files/figure-html/unnamed-chunk-3-1.png){fig-align='center' width=768}\n:::\n:::\n\n\n### Predictor screening rules\n\nThe package now uses predictor screening rules to vastly improve performance\nin the $p \\gg n$ domain. Screening rules are part of what makes\nother related packages such as glmnet so efficient. 
In SLOPE, we use a\nvariant of the strong screening rules for the lasso [@tibshirani2012].\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\nxy <- SLOPE:::randomProblem(100, 1000)\nsystem.time({SLOPE(xy$x, xy$y, screen = TRUE)})\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n user system elapsed \n 1.296 0.000 0.168 \n```\n:::\n\n```{.r .cell-code}\nsystem.time({SLOPE(xy$x, xy$y, screen = FALSE)})\n```\n\n::: {.cell-output .cell-output-stdout}\n```\n user system elapsed \n 3.639 0.007 0.463 \n```\n:::\n:::\n\n\n### Cross-validation and caret\n\nThere is now a function `trainSLOPE()`, which can be used to run\ncross-validation for optimal selection of `sigma` and `q`. Here, we run\n8-fold cross-validation repeated 5 times.\n\n\n::: {.cell layout-align=\"center\"}\n\n```{.r .cell-code}\n# 8-fold cross-validation repeated 5 times\ntune <- trainSLOPE(\n subset(mtcars, select = c(\"mpg\", \"drat\", \"wt\")),\n mtcars$hp,\n q = c(0.1, 0.2),\n number = 8,\n repeats = 5\n)\nplot(tune)\n```\n\n::: {.cell-output-display}\n![Cross-validation with SLOPE.](index_files/figure-html/unnamed-chunk-5-1.png){fig-align='center' width=672}\n:::\n:::\n\n\nIn addition, the package now also features a function `caretSLOPE()` that\ncan be used via the excellent caret package, which enables a swath\nof resampling methods and comparisons.\n\n### C++ and ADMM\n\nAll of the performance-critical code for SLOPE has been rewritten in \nC++. In addition, the package now features an ADMM solver for\n`family = \"gaussian\"`, enabled by setting `solver = \"admm\"` in the call\nto `SLOPE()`. Preliminary testing shows that this solver is faster for\nmany designs, particularly when there is high correlation among predictors.\n\n### Sparse design matrices\n\nSLOPE now also allows sparse design matrcies of classes from the Matrix package.\n\n### And much more...\n\nFor a full list of changes, please\nsee [the changelog](https://jolars.github.io/SLOPE/news/index.html#slope-0-2-0-unreleased).\n\n## References\n\n",
"supporting": [
"index_files"
],
Binary file modified _freeze/blog/slope-0-2-0/index/figure-html/unnamed-chunk-3-1.png
Binary file modified _freeze/blog/slope-0-2-0/index/figure-html/unnamed-chunk-5-1.png