Giving your package documentation site a little personality and much more with {pkgdown} customization, plus a novel package that brings the power of LaTeX to all of your plots!
Episode Links
- This week's curator: Batool Almarzouq - @batoolmm.bsky.social (Bluesky) & @batool664 (X/Twitter)
- Customize your expedition: Create a unique documentation for your R Package
- LaTeX Typesetting in R: The 'xdvir' Package
- Entire issue available at rweekly.org/2025-W11
- {xdvir} package vignette https://cran.r-project.org/web/packages/xdvir/vignettes/xdvir-intro.html
- {tinytex} - A lightweight, cross-platform, portable, and easy-to-maintain LaTeX distribution based on TeX Live https://yihui.org/tinytex/
- Adding arm64 to r2u https://dirk.eddelbuettel.com/blog/2025/03/04#046_arm64_comes_to_r2u
- Website for tidyplots use cases https://tidyplots.org/use-cases/
- Use the contact page at https://serve.podhome.fm/custompage/r-weekly-highlights/contact to send us your feedback
- R-Weekly Highlights on Podcastindex.org - You can send a boost into the show directly in the Podcast Index. First, top-up with Alby, and then head over to the R-Weekly Highlights podcast entry on the index.
- A new way to think about value: https://value4value.info
- Get in touch with us on social media
- Eric Nantz: @[email protected] (Mastodon), @rpodcast.bsky.social (BlueSky) and @theRcast (X/Twitter)
- Mike Thomas: @[email protected] (Mastodon), @mike-thomas.bsky.social (BlueSky), and @mike_ketchbrook (X/Twitter)
- A View from the Top (of Shinra HQ) - Final Fantasy VII Remake - TheManPF, Andre Beller, jnWake, Kev Ragone, Michelle Dreyband, newmajoe, Ronin Op F - https://ocremix.org/remix/OCR04777
[00:00:03]
Eric Nantz:
Hello, friends. We're back with episode 198 of the R Weekly Highlights podcast. Boy, that magic number is getting closer, isn't it? But, nonetheless, this is the weekly show where we talk about the awesome highlights and other resources that are shared every single week at rweekly.org. My name is Eric Nantz, and I'm delighted you join us from wherever you are around the world. Yes, I am very thankful we're already in March, so I feel like my mood's lifting up. The sun's coming out more. The time change happened in my neck of the woods, so it's lighter later in the day, although it's darker in the morning, so I guess we can't have everything at once. But I'm happy, all things considered. And I'm also happy because returning to the show is my awesome co-host, Mike Thomas.
[00:00:49] Mike Thomas:
Mike, we're so glad to have you back. I barely flew the ship solo last time, so thank you for being here. Well, that's not true at all. I listened to the great solo episode you recorded, and I appreciate you doing that and carrying the water for us for that week. Well, I was out for a couple weeks, unfortunately, but very happy to be back. Thank you to the R Weekly
[00:01:11] Eric Nantz:
folks for starting me off on a lighter issue this week to ease me back into the R Weekly groove. Yep. Sometimes we need that little bit of easing into it. But, yeah, let's dive right into it. And our curator this week is Batool Almarzouq. And as always, she had tremendous help from our fellow R Weekly team members, and contributors like all of you around the world with your pull requests and other great suggestions. And we're gonna give a little personality to a great topic that, you know, has been one of my big focuses in the last couple years on a lot of internal projects, and that, of course, is writing good and concise documentation for your packages.
And what better way to surface that information to your end users, whether you're doing something internally or you're releasing this to the open source world, if you will, than to create a website for your package in R. And there is a package that's been kind of the front runner in this space for quite a few years now, and we are talking about the {pkgdown} package, or p-k-g-d-o-w-n. I always wonder how people say these things, but I'm gonna say package down and run with it. But I've been using it for years, and out of the box, you get a nice utility type of experience, but there's a lot more you can do with it, folks. And that's where our first highlight comes in, and it is authored by Murielle Delmotte. Hopefully, I've said that correctly.
They're from ThinkR. So this is from the ThinkR blog, which is titled customize your expedition: creating unique documentation for your R package. So we first start off in this post with how straightforward it is to get one of these up and running if you leverage one of my favorite utility packages for development in the R space, the {usethis} package. It seems like there's almost nothing {usethis} can't do, other than literally just, you know, fix all my errors at once, but maybe that will come later on. But you can bootstrap a pkgdown site with usethis::use_pkgdown(). When you have your package already configured with your R functions and their function documentation, this is gonna give you that basic structure right off the bat, and you can literally build your package site interactively right then and there with pkgdown::build_site().
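The two steps just described can be sketched as follows; this is a minimal sketch, run from the root of an existing package project:

```r
# Bootstrap pkgdown: adds a _pkgdown.yml config and updates .Rbuildignore
usethis::use_pkgdown()

# Render the site locally (into docs/ by default) and open a preview
pkgdown::build_site()
```

From there, every rebuild picks up changes to your function documentation, vignettes, and NEWS file.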
And there you go. If you render that in a preview in your editor of choice, you've got a very, again, utilitarian-looking package site right off the bat, which will give you an index of your functions in the reference tab. And if you had any vignettes that you wrote as part of your package, they will also be included in the articles section of the pkgdown site. And last but certainly not least, right off the bat, if you are using a NEWS file to keep track of your package updates and changes, that will be in the changelog section of your pkgdown site.
I visit that section every single time I'm using a package, because there are a lot of fast-moving packages I'm leveraging lately, and I always like to see just what happened in that update compared to the previous version, especially if you're dealing with, you know, older R versions with older package repositories. And then you start to leverage, like, a development project with {renv}, and then you realize you've updated to a new version of that package. So what changed between those two versions? You know, the changelog can be helpful in that approach.
So, again, right off the bat, you got a pkgdown site ready to go. But as this blog post is saying, there's a lot more you can do here. Like a lot of things that are fundamentally based on R Markdown, which pkgdown is, I'm sure, very much inspired by, a lot of the configuration in your site is driven by a YAML file, the _pkgdown.yml file to be exact. I swear, I now understand why some people keep saying YAML is almost like a programming language in and of itself, because there are so many things that we can configure with it. Right? So we're gonna walk through, in the case of pkgdown, some of the options that you have here, starting with the navigation bar. I told you what happens by default, but you can organize this any way you see fit with the structure element.
You can determine what goes on the left, what goes on the right, let's say those icons for the GitHub repository of your package. You can put any other icon or reference that you want in there. There's a lot more you can do with that. And then to really look at how those navigation elements are constructed, you can have components within each of them. So in the example we see here, they've added a new section called expeditions. And within that expeditions section, it's got its own submenu of different elements. This may look familiar because if you've been using Quarto lately, you can kinda take a similar approach when you make a website with Quarto and how you construct the navigation bar there as well. So there are some interesting synergies here. It's not always synergistic, but there are some interesting things you can take from constructing, say, a Quarto site and whatnot. So there's a menu section where they have the hyperlinks and then the text for that hyperlink.
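A navbar customization along these lines might look like the following in _pkgdown.yml; the section name and the hrefs here are hypothetical placeholders, not taken from the blog post:

```yaml
navbar:
  structure:
    left: [intro, reference, articles, expeditions, news]
    right: [search, github]
  components:
    expeditions:
      text: Expeditions
      menu:
        - text: Getting started
          href: articles/getting-started.html
        - text: "-------"
        - text: Advanced routes
          href: articles/advanced.html
```

The entry made of hyphens is the separator trick the hosts discuss: arbitrary hyphen text renders as a divider between groups in the dropdown.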
Again, once you grok it and look at it for a little bit, it makes a lot of sense. And you can also, again, make sure that if you want to space things out, which is a nice navigation feature, there's a little tip that I saw: you can have arbitrary text in these menus and just put a bunch of hyphens in it. Now it gives you a nice little separator between the different groups if you wanna organize it in that fashion. So little things along the way that I learn every time I read these posts. The other meat of a pkgdown site is typically the function documentation. This is where, literally every day, I'm developing an internal package or I'm leveraging an internal pipeline that's using a bunch of packages.
I have on the right side of my screen the pkgdown site of that other package, and I'm looking at the documentation right off the bat. Typically, in RStudio, you could look at the documentation in the editor itself. Lately, I've been using Positron, and they don't have that nice little help browsing panel just yet. You have to do question mark, name of the function, and then it appears. So sometimes I just like having the web browser on the right side with the package documentation right there. That way I can just kind of navigate back and forth. So back to what we're talking about here, in a pkgdown site, there is a reference section.
And this is where you can kinda give categories or groupings of your functions without necessarily having to do that in your function documentation itself in the package. Case in point, in the example here, they've got a reference section with different titles for these groups in the title argument, and then a description that gives a more plain text, kind of one or two sentence, description of those contents. And then inside the contents section are the names of the functions that go into that group. This is really, really important, especially to me, as you're developing a package that may have 20, 30, sometimes 40 functions, but there are inherent groupings that you want so a user can quickly understand what certain functions do, maybe pinpointing an output type of function or a function that's processing data or whatnot.
So a lot of the pkgdown sites that I've seen lately are using this grouping feature really effectively. I'm really appreciative that this is easy to do now in pkgdown to give that more enhanced user experience browsing the functions in your package. So it's really just setting the title, the description, and then the contents being the names of the functions going into that group. And then once you rerender that, you would see in your reference section that you've got really nice, easily viewable groups for these functions.
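The title/description/contents pattern described above looks roughly like this in _pkgdown.yml; the group names and function names are hypothetical stand-ins for your own:

```yaml
reference:
  - title: Data processing
    desc: Functions that clean and reshape raw inputs.
    contents:
      - clean_data
      - reshape_data
  - title: Outputs
    desc: Functions that produce plots and summary tables.
    contents:
      - plot_summary
      - starts_with("table_")
```

Rebuilding the site then renders each group as its own labeled block on the reference page.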
All of this at this point has been the structure of your site and literally the elements inside of it. We've left out one important detail that, Mike, you're gonna rectify now. Let's bling this up a little bit, shall we? Let's make it pretty for sure. And that's where,
[00:10:18] Mike Thomas:
you know, CSS and HTML and Bootstrap can come into play here, thanks to the folks who have authored the pkgdown package for us. Again, we can configure all of these things in this really nice YAML document, the _pkgdown.yml file, as you mentioned. And there's a section that you can add called template. And underneath that section, there are two immediate subparameters. One is bootstrap, which specifies the version of Bootstrap that you want to use. Typically, nowadays, I think 5 is the latest and greatest and has been for some time. Who knows when Bootstrap 6 will drop? And then you can specify some parameters as well. And a lot of folks, in my experience, find the Bootswatch project to be very handy, because it is a set of, I don't know how many there are today, Eric, maybe 30 ish or so, I feel like. Yeah. I gotta say it's at least 20 of these Bootswatch themes that I've seen in the wild. Yeah. Yeah. Sort of pre-specified, different themes in terms of, you know, what color is the nav bar, the background, the foreground, font, styling, all sorts of things like that. So you can play around with the 20 or 30 sort of preset options that they provide you, which can be a nice starting point. Maybe that works perfectly. You find one that looks exactly like you want.
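In _pkgdown.yml, that template section might look like this; the Bootswatch theme name is just an illustrative choice:

```yaml
template:
  bootstrap: 5
  bootswatch: flatly
```

Swapping the bootswatch value for any other theme name (cerulean, darkly, and so on) restyles the whole site on the next build.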
Or if you wanna go even a little bit further, you can add an extra.css file in your package's pkgdown directory, and that can help update the CSS properties of your pkgdown site, which is really cool. You can set things like the navbar color, you know, navbar drop-down links, you know, the font weight, all sorts of different types of things. Any sort of CSS element on your pkgdown site that you might wanna customize, you can handle with this extra.css file, and that allows you to leverage whatever sort of branding you want to incorporate, or anything like that that you think is going to make your pkgdown site pop or fit into sort of the theme and styling that you had in mind for your package. pkgdown, you know, it's awesome. It's something that we have in the R ecosystem that I think has completely exploded.
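A pkgdown/extra.css file along those lines might contain rules like these; the selectors and colors are hypothetical examples, not taken from the post:

```css
/* Match the navbar to a brand color */
.navbar {
  background-color: #2c3e50;
}

/* Give dropdown links a heavier font weight */
.navbar .dropdown-menu a {
  font-weight: 600;
}
```

pkgdown picks this file up automatically on the next build, and these rules override the theme defaults.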
I think we've talked about this before, but when I go to look up an R package that doesn't have a pkgdown site, it's really sad, because I have to maybe go through, you know, one of the old PDFs that, you know, gets published out there when a package is published on CRAN, and that's certainly not as fun to explore as, you know, an interactive HTML pkgdown site that has the searchability and all those types of cool things. And maybe one of these days, pkgdown and webR will collide, and we can run some of these functions from a package that we're exploring in the browser at the same place where we're looking at the documentation. That is one of my dreams. We'll see. On a sort of similar tangent, I've been using a lot of Polars from the Python ecosystem lately.
I'd call it maybe a dplyr equivalent with a lot of speed improvements, but in terms of syntax, it's fairly similar to what you might be familiar with in the tidyverse. And it has a great site that looks kind of like a pkgdown site, maybe even a little bit more robust because of everything that's going on in that package. And I noticed this week that there's a new little icon in the bottom right corner that's an AI chatbot that allows you to ask a question of the documentation. And my hope is that it's maybe RAG-based or, you know, fully up to date with the latest developments in the Polars package, so that we are not doing the thing where I go to ChatGPT, which was trained on Polars six months ago.
And I am writing legacy code sometimes, as opposed to, you know, writing the correct code based upon where the package and the project stand at this point in time. So maybe we'll see something like that come to the pkgdown ecosystem as well, but I thought that was an interesting and relevant anecdote.
[00:14:41] Eric Nantz:
Oh, yeah. That is intriguing for sure. And, you know, our listeners might be out there wondering, well, how do I actually know if this package I'm looking at or I'm using in my R session has one of these pkgdown sites? So there's a few ways you could probably get to that. One thing I typically check first is I'll search for the package name, and then typically it will be in your search results. But sometimes, if it doesn't quite appear obvious, you go, if it's on CRAN, to the CRAN entry, and then there will be a link in the URLs to that pkgdown site if it has one. Also, if the package is on GitHub, usually the package author will put the pkgdown site as, like, the website associated with that repository.
Not all the time. Sometimes you have to do a little sleuthing to get to it, but for discoverability, those are the things that I've done as well. But going back to the styling stuff, Mike, I wonder if our listeners may be wondering, well, in this blog post, they kinda knew to style these, like, dot navbar elements and the drop-down menu elements. And for someone new to this world of styling, they may be wondering, how did they know to write that type of syntax? What would be your preferred method to discover those elements?
[00:16:02] Mike Thomas:
To discover CSS elements that we want to style? Yeah. Yeah. Like, how would they know what to target when they wanna style the nav bar? How do you think they would get there? Yeah. It's a good question. So, you know, the workflow that I typically have, and it takes some getting used to, and I imagine that this may be your workflow as well. But you could preview the site in your browser, and then, depending on sort of what operating system you're on, you should be able to right click in your browser and find a navigation to something called developer tools, or inspect this site.
And that typically brings up this right hand sidebar that will show you all the HTML and CSS properties of the site that you're looking at. And if you sort of hover over, you know, some of the different elements in that sidebar, it should show you on the left hand side of your screen, sort of highlight, the related area on the UI that that CSS element relates to. And you should be able to sort of expand and collapse these different CSS properties to be able to take a look at the components of them, the parameters of them that you can configure. And once you are able to do that, you can get in a little bit of a workflow of understanding which element you need to target, and then naming that correctly in your custom CSS file and overriding, you know, or updating the particular property that you'd like to change.
[00:17:33] Eric Nantz:
Awesome tip there. And I wanted to make sure that, you know, when I was getting into a lot of the Shiny UI customization stuff many years ago, I would see these fantastic looking apps, and this is well before {bslib} came in the picture. It just seemed like such magic to me. How did they do that? How did they know what to get to? And since then, and since reading some great resources like, you know, Outstanding User Interfaces with Shiny by David Granjon and others, there was always one common theme: in web resources, whether it's Shiny, whether it's pkgdown or just Quarto in general, or anything that you build with HTML, CSS is behind it one way or another. And one of the best ways to get there is to get into the system, as I say, with the dev tools or the dev console.
And once you get the hang of it, well, it does take a little getting used to, but I just wanted to reassure our listeners out there that may be new to this that it may seem really cryptic at first. No. Just start simple. Just style one thing. Try it out. The best thing about those dev consoles is that you can do all this interactively; it's not gonna break your actual code. Right? It's just literally in that preview. There's no real cost. It's like a sandbox to try things out. It always seemed like, oh my gosh, I'm changing it here. How do I know what to do over there? So it just takes some getting used to. But I thought it was a nice touch in this post, because I don't often see this level of detail with pkgdown sites. So really great post here to walk us through it.
[00:19:12] Mike Thomas:
Absolutely. No. That's a good tip. Definitely don't be scared that anything that you change in your developer tools is going to be stuck forever.
[00:19:35] Eric Nantz:
So we were just talking about package documentation, and a lot of times, especially with those packages that are offering a statistical methodology, maybe a new modeling, you know, algorithm under the hood, just like anything in stats, there's a lot of math behind it. And a lot of times, especially when you look at things like a package vignette or a research paper associated with it, you've got yourself some mathematical notation, and that is usually written with LaTeX. Yes. I had to write my dissertation with LaTeX. It was not fun. I have to be honest with it. I knew enough to be dangerous, but not much after that.
And I've been away from writing LaTeX for quite a while, until literally a few weeks ago when, as I was telling Mike in the preshow, I'm working on a Shiny app that's, you know, surfacing some Bayesian methodologies under the hood. And I am a Bayesian novice. I know just the very high level details of how things work. We were trying to solve a thorny issue with one of our distributions not getting the right, you know, probability of success, where I was getting nonsensical results and the probability was going up out of the range of zero and one. And I look at the code, and I'm realizing I really don't have an idea of how some of this works. I gotta write this out to see where the problem is.
So I wrote LaTeX in a Quarto doc, and that brought back some memories, folks. And LaTeX, whether you like it or perhaps even don't like it, is the tried and true standard for mathematical notation. I've been talking about it in the context of documentation, but there are a lot of times, especially in statistical teaching, whether it's in grad school or maybe with an application or a web page that is surfacing a visualization of, say, a model type, when you wanna actually inject, if you can, that mathematical notation of that distribution or that probability function into the visualization, to help the reader, you know, maybe learn along the way.
If you've ever wanted to do that in R itself, there may have been some approaches in the past, but this next highlight here has a pretty fascinating approach to make all this happen. And this is called the x-d-v-i-r package. Maybe it's pronounced x-divir? No idea. But it is authored by Paul Murrell, who is a member of the R Core team and has been one of the front runners, the trendsetters, of leveraging the grid graphics system in R ever since its inception. So talk about, you know, a tried and true and high quality package coming into play here.
And so we're gonna talk a little bit about what this package actually does, and we'll have a link to the vignette we're about to narrate here in the show notes if you wanna follow along afterwards. But the main purpose of {xdvir} is to let you put LaTeX snippets in your R plots. Notice I said plots. I didn't say specifically which type of plot, because xdvir actually supports all of them. That's an achievement in and of itself. So under the hood, what's really happening here? Well, LaTeX, as some of you may know, is a type of markup language for, again, mathematical notation and a very precise way of defining all that.
So in order to put these equations, or this notation, onto a plot, you need three things to make this happen. You need the language itself, LaTeX. You need the typesetter, which is gonna help determine how this LaTeX notation is gonna be translated into the glyphs and the fonts that go into these plots. And then you need to actually render it at the location where the user desires to put it in the plot itself. So the xdvir package is automating a lot of the manual steps in those three different stages. So first, you define a character vector. You can call it anything you want, but inside that vector, you've got yourself the LaTeX notation right there.
So once you define that, then you can, you know, probably do what I did. I had to look up a bunch of LaTeX cheat sheets a couple years ago to figure out, how do I do the integral? How do I do the sigma summation? It was flashbacks. Lots of flashbacks to that. So in this example of the vignette here, he's got the notation for the standard normal distribution, if I spy correctly. And then once you have that ready to go, you can then choose, from whatever plot method you're using, how to put it in.
And the package comes with a function, in the case of lattice graphics, a grid.latex() function, where you give it that LaTeX code and then the x and y coordinates of where you're putting it. And this is also leveraging some additional packages, such as, of course, the grid package itself, to help you with that positioning. And then once you have that, as you see here, there's a plot in the vignette of the standard normal distribution with the CDF, I believe, or the PDF of the function in the upper left corner, and it looks really sharp and very legible.
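That workflow can be sketched like this; a minimal sketch assuming {xdvir} is installed along with a working LaTeX distribution, annotating a base plot via grid coordinates:

```r
library(xdvir)
library(grid)

# LaTeX for the standard normal density, stored in a character vector
tex <- "$f(x) = \\frac{1}{\\sqrt{2\\pi}} e^{-x^2/2}$"

# A simple base plot of the density curve
x <- seq(-4, 4, length.out = 200)
plot(x, dnorm(x), type = "l")

# Typeset and render the LaTeX snippet in the upper-left region
grid.latex(tex, x = 0.2, y = 0.85)
```

The x and y values here are positions on the graphics device, so you may need to nudge them to land the label where you want relative to the plot panel.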
Not like it's a fuzzy copy-pasted thing that you would do manually. It looks like the grid system should. And you can do this, of course, with other types of plots, such as a traditional base plot. If you just did a standard plot of, say, y and x on the y and x axes, you can use the grid.latex() function again, a very similar paradigm as we just saw with lattice. And last but not least, you can also do this with ggplot2. Many of our listeners love using ggplot2 for their visualizations. So the xdvir package comes with additional functions where you can put a LaTeX grob into the ggplot object.
And, again, the parameters are the same: the LaTeX code, x and y coordinates, and the justification. Lots of interesting ways you can do this. There are even other helpers for the ggplot2 users out there, such as element_latex(), where if you wanna actually put this in labels, such as in your title or whatnot, you can throw it in there too. And you can also do it for styling symbols in your plot, such as the points on a scatter plot, as an example of that as well. Putting, like, the notation for the sample average with subscripts on selected points.
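On the ggplot2 side, the element_latex() helper mentioned above might be used like this; a hedged sketch, since the exact arguments may differ from the released xdvir API:

```r
library(ggplot2)
library(xdvir)

ggplot(mtcars, aes(x = disp, y = mpg)) +
  geom_point() +
  # LaTeX in the plot title, typeset by xdvir instead of plain text
  labs(title = "$\\bar{x} \\pm s$ for displacement vs. MPG") +
  theme(plot.title = element_latex())
```

The idea is that element_latex() drops in where element_text() would normally go, so any themed label can carry LaTeX notation.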
I mean, that's awesome. It works, and it looks like it's a pretty easy way to opt right into it. So there are some things you wanna know if you wanna embark on this with your next visualizations. We were just talking about the high level, but in order to pull this off, the machinery that's needed for it is, of course, a LaTeX distribution on your system. A lot of people like to use the TinyTeX distribution that Yihui Xie authored many years ago as a very small, fit for purpose version of this, but you could use other ones, like XeTeX, I'm not sure how to say it, or LuaTeX. There's a bunch of other ones you could use with it too.
And then once you have that distribution, that might be the only real prerequisite you need. But for knowing what's happening under the hood, the link that we have in the show notes is the more verbose manuscript that Paul has authored about these different stages of the processing: going from the authoring step, where it talks about what it surrounds the LaTeX code with to make it a standalone LaTeX document ready for further processing; how it's converted into the DVI intermediate object that LaTeX often produces in between, going from the raw LaTeX code to your output object, which may be a PDF or, in this case, a snippet that will go into a grid graphic later on; how it's dealing with typesetting; and how you might debug this when things don't work. Because, oh no, I did have a lot of late nights trying to debug LaTeX when I was compiling that dissertation before the deadline, figuring out why the heck... you know what? It's just like Shiny. Missed a bracket.
Missing brackets in LaTeX is not fun, because it's not as friendly as saying you missed a bracket. The error messages are quite cryptic. So there's some narrative about that as well, along with the different customizations that you can do with rendering, the resolution and whatnot, doing multiple pages. There's a lot under the hood here. So, again, I only know enough about LaTeX to be dangerous, but next time I'm doing a static graph that has a statistical concept to surface, I'm gonna give xdvir a try and see what I can do.
[00:29:21] Mike Thomas:
Absolutely. And I feel like these are the packages that don't get as much credit as they probably deserve. And it sounds like Paul Murrell has really broken a ton of ground in this space over the years. So I think this is an appropriate time to give Paul a huge shout out and a huge thank you for all the work that he's done in this space for those of us who take it for granted, to get our plots to render the way that we want them to render. Right? Which is not a trivial task, the more and more that I read up about it. And you're exactly right, Eric. You know, Paul talks about some of the related packages that are out there, because we do have a lot of choice now in the open source software ecosystem. And xdvir, you know, provides a lot of similar functionality to {gridtext}, {ggtext}, and the {marquee} package, which I think is one of the more recent ones by Thomas Lin Pedersen and some other folks.
But sort of the main difference between xdvir and those packages is that the others are built on Markdown, while xdvir is built on LaTeX, and, in Paul's words, with all of the joy and pain that that brings. Definitely my favorite excerpt from this vignette. And it couldn't be more true, because I think LaTeX allows us to do some of the customization that we sometimes really need, and have that ability to really, for the most part, in my experience, you know, fully customize whatever we want to show on screen or in our PDFs, as opposed to other methodologies. But you're exactly right. It can be cryptic in terms of the error messages. The syntax is, I think, objectively not the prettiest to look at.
And, you know, it's just one of those things. And I really did also appreciate, as you mentioned, the final section of the vignette around troubleshooting, where it's a little bit of a deep dive into the weeds on how xdvir relies on the glyph rendering system that was added in R version 4.3.0. So it seems like fairly recent developments made this possible, and there's a little bit of a discussion about the particular graphics devices that you will have to have available and set, and ensuring that you have a TeX installation. Right? And a lot of us use TinyTeX for that purpose. That's a Yihui Xie project, and it is another unsung hero. Even if it is sung, it's not sung enough, in my opinion. Well said. Yeah. This vignette provides a really great walkthrough of the more detailed highlight, so feel free to check out both.
[00:32:05] Eric Nantz:
Yeah. And like I said, I've been in the weeds a bit with this other app surfacing this information, and we do have to, at some point, build in a feature to get more of the static outputs out of the app for, you know, further discussions of a team, or the file as a record for, you know, posterity, if you will. So I may end up actually using this in that app when we do, like, a ggplot2 version of that distribution type. Because we're mapping out these prior distributions that we're letting the user customize in this app, and we indeed do have a section in the app that literally plots out that distribution function for the prior, such as a normal distribution, or other ones for hazards and whatnot. So, yeah, I'm gonna give this a play and see what happens.
And there's a lot more to play with in this issue as well. We got a lot of great additional resources that Batool has put together for us in this issue. So it'll take a couple minutes for our additional finds here. I'm gonna get into the architecture weeds a little bit on this, but it's a great achievement nonetheless. A recent blog post from Dirk Eddelbuettel has announced that the r2u repository now serves arm64 versions of R packages. Woo hoo! So you may be wondering, why is this a big deal? Well, as he points out in his blog, a lot of the major cloud infrastructure providers are starting to move their servers to ARM to save on cost and power usage, and, especially in the Apple space, a lot of the newer laptops are powered by ARM processors, such as the M1 chips and the like.
ARM hasn't exactly been a new thing for the R ecosystem, but this is a very new thing for the Linux users out there that are leveraging ARM processors, whether it's through their systems or through containers, and also through GitHub Actions. That's how all this came to be: more recently, Microsoft, who of course owns GitHub, has let users target a version of Ubuntu with ARM architecture, and not just the typical x86_64 architecture that most Intel and AMD processors use. So he does mention that there is a lot of support right off the bat. I think more than three quarters of the packages are available in binary format right away, because they don't contain compiled code. Of the ones that do need compilation, I think he says about 4,500 of the 5,000 are supported in this repository, so there's still a handful to go.
But he says that it is on the way to being available for everybody. So congratulations to Dirk and the team behind r2u. This is a massive achievement in the space of package architecture in the R ecosystem.
[00:35:07] Mike Thomas:
Thank you for calling that one out. If you didn't, I was going to, Eric. You beat me to the punch. Yes. Big news for us. We're heavy, heavy users of the r2u project, albeit primarily in Docker Linux environments. But I think this is still super important for some of those use cases that we do have where this will show up. So awesome achievement, and thanks to Dirk and everyone else for their work on that project in general. I did want to shout out a package that I admittedly have not used yet, but I acknowledge is absolutely awesome. And it's the tidyplots package by Jan Broder Engler, who's out of Germany.
And there is a new website out there called tidyplots use cases that he's put together, with a bunch of beautiful charts, each with a little collapsible code snippet above it. And the conciseness of the R code that we're able to write to generate these beautiful plots is pretty mind-boggling. So I would recommend just checking out this site. It's very visual. It's really all charts with a little collapsible code snippet above each, and I am going to guess that, like me, it will blow you away. So, pretty cool project, and I think one to check out. And maybe this is a very easy introduction into it. And I really applaud him for taking the step of creating this particular site, this particular page, because I think it demonstrates this package very, very well, and it was a great sort of eye-opening moment for me to then dive into the GitHub and the pkgdown site and take a little deeper look at the tidyplots package.
[00:36:59] Eric Nantz:
This is fantastic. Oh, so glad you called this out. I literally see things here that would have been very useful for me about eight or nine years ago when I was doing a lot of biomarker analysis. There's a section for bioinformatics plots here, such as the famous volcano plot for p-values and fold change, and correlation heat maps. Oh, my goodness. This tidyplots package is really top-notch for getting you up and running quickly. I'm gonna assume it's building upon ggplot2, because it looks very, very similar, but look at the lines of code: it takes maybe just three or four lines to get a lot of these plots up and running. So, okay, this is bookmarked for sure. This is a great resource.
And, again, for any visualization type of package or utility that you put out there, this is a great template to draw upon. The code's available on demand, and you can quickly jump to what visually pleases you the most, then grab that and run with it. Really good find here.
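For a flavor of the piped tidyplots style the site showcases, here is a sketch based on the package's advertised API. Treat it as an assumption-laden illustration: `study` is, I believe, the demo dataset shipped with the package, and the exact `add_*()` verb names should be checked against the documentation.

```r
library(tidyplots)

# Sketch of the tidyplots pipeline: start a plot with tidyplot(),
# then layer on components with add_*() verbs
study |>
  tidyplot(x = treatment, y = score, color = treatment) |>
  add_data_points_beeswarm() |>
  add_mean_bar(alpha = 0.4) |>
  add_sem_errorbar()
```

A handful of piped lines like these is what produces the polished charts on the use-cases site.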
[00:38:05] Mike Thomas:
Yeah. I'm seeing in the DESCRIPTION file ggplot2, ggbeeswarm, ggpubr, ggrastr, ggrepel. The Hmisc package, I think that might be bioinformatics
[00:38:18] Eric Nantz:
related, if I'm thinking of some... Oh, that's from Frank Harrell. Frank Harrell has maintained the Hmisc package for many years, which I remember using because I had one of his textbooks on survival analysis. And so he had some of his own functions in there. So he's been maintaining that for probably over twenty years at this point.
[00:38:39] Mike Thomas:
I was trying to remember where I had seen that one before. And patchwork is in there as well. Oh, yeah. So, yeah, we've got all the visualization heavy hitters.
[00:38:49] Eric Nantz:
Excellent. So, again, these are the things that you see every week in R Weekly. There's always something new that Mike and I will look at. We're like, did you know that was there? No, I didn't. Well, we do now. So
[00:39:01] Mike Thomas:
tidyplots is another one. You deserve your flowers, whoever authored tidyplots. I'm gonna take the mask off and let the audience know that we do this as much for you as we do for ourselves, to keep up to date with what is going on in the R ecosystem.
[00:39:17] Eric Nantz:
I tell everybody, whether it's at the day job or people I talk to on social media, that R Weekly is one of the best things I ever discovered, because I've learned so much in these issues, being able to talk about them and then learn from them. Yeah. Can't have it any better. Like I said, look at the back catalog of R Weekly as well. There is a lot more to choose from. And I know, Mike, you've been looking at the back catalog during your time off and saw some great things. So, yeah, there's a lot that happens every week. So where is it, you ask? It's at rweekly.org, of course. You should have that in your bookmarks, and it's ready for you right there to read at your leisure.
And, also, it is a community project, so we invite you to help out in whatever way you can. R Weekly has always been kind of under this value-for-value mindset. If you get value from it, we would love to have value back. If it is just your time, sharing it with your colleagues, or sending us a pull request with that great new resource that you found online, we'd love to hear about it there too. And, also, in the case of this very podcast, we have a fun little boost mechanism. If you wanna send us a little boost along the way, we have links on how you can do that as well. But, also, we'd love to hear from you on our social media channels. So if you wanna get in touch with us, you can find me on Bluesky at rpodcast.bsky.social.
Also on Mastodon, where I'm @[email protected], and I'm on LinkedIn. You just search my name, and you'll find me causing all sorts of fun there. And, Mike, where can the listeners find you?
[00:40:55] Mike Thomas:
Primarily on Bluesky these days at mike-thomas.bsky.social, or on LinkedIn. You can find me if you search Ketchbrook Analytics, k-e-t-c-h-b-r-o-o-k.
[00:41:09] Eric Nantz:
Yeah. Hard to believe, we just finished 198. We've got two more to go before the big one. So, as I keep saying, if you have a great memory of what we talked about in the past and what you find most enjoyable about R Weekly, we'd love to hear about it. So definitely get in touch with us on social media. We also have a contact page that's linked in the show notes if you wanna share your feedback too. We'd love to read it on the air for episode 200. We'd love to hear from you. With that, we're gonna close up shop for episode 198. Thank you so much for joining us today, and we'll be back with episode 199 of R Weekly Highlights next week.
[00:00:49] Mike Thomas:
Mike, we're so glad to have you back. I barely flew the ship solo last time, so thank you for being here. Well, that's not true at all. I listened to the great solo episode you recorded, and I appreciate you doing that and carrying the water for us that week. Well, I was out for a couple weeks, unfortunately, but very happy to be back. Thank you to the R Weekly
[00:01:11] Eric Nantz:
folks for starting me off on a lighter issue this week to ease me back into the R Weekly groove. Yep. Sometimes we need that little bit of easing into it. But, yeah, let's dive right into it. And our curator this week is Batool Almarzouq. And as always, she had tremendous help from our fellow R Weekly team members, and contributors like all of you around the world with your pull requests and other great suggestions. And we're gonna give a little personality to a great topic that, you know, has been one of my big focuses in the last couple years across a lot of internal projects, and that, of course, is writing good and concise documentation for your packages.
And what better way to surface that information to your end users, whether you're doing something internally or you're releasing this to the open source world, than to create a website for your package in R. And there is a package that's been kind of the front runner in this space for quite a few years now, and we are talking about the pkgdown package, or p-k-g-d-o-w-n. I always wonder how people say these things, but I'm gonna say "package down" and run with it. But I've been using it for years, and out of the box you get a nice utility type of experience, but there's a lot more you can do with it, folks. And that's where our first highlight comes in, and it is authored by Murielle Delmotte. Hopefully, I've said that correctly.
They're from ThinkR. So this is from the ThinkR blog, in a post titled "Customize your expedition: Create a unique documentation for your R Package." We first start off in this post with how straightforward it is to get one of these up and running if you leverage one of my favorite utility packages for development in the R space, the usethis package. It seems like there's almost nothing usethis can't do, other than literally just fix all my errors at once, but maybe that will come later on. But you can bootstrap a pkgdown site with usethis::use_pkgdown(). When you have your package already configured with your R functions and their function documentation, this is gonna give you that basic structure right off the bat, and you can literally build your package site interactively right then and there with pkgdown::build_site().
And there you go. If you render that in a preview in your editor of choice, you've got a very, again, utilitarian-looking package site right off the bat, which will give you an index of your functions in the reference tab. And if you had any vignettes that you wrote as part of your package, they will also be included in the articles section of the pkgdown site. And last but certainly not least, right off the bat, if you are using a NEWS file to keep track of your package updates and changes, that will appear in the changelog section of your pkgdown site.
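In code, the two steps just described look like this. This is a minimal sketch run inside your package project; the function names are the usethis and pkgdown exports mentioned above.

```r
# One-time setup: creates the _pkgdown.yml configuration file
# and keeps the site out of the built package
usethis::use_pkgdown()

# Build the site from your Rd docs, vignettes, README, and NEWS,
# and preview it locally
pkgdown::build_site()
```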
I visit that section every single time I'm using a package, because there's a lot happening in the fast-moving packages I'm leveraging lately, and I always like to see just what happened in that update compared to the previous version, especially if you're dealing with, you know, older R versions with older package repositories. And then, when you start to leverage, say, a development project with renv, you realize you've updated to a new version of that package. So what changed between those two versions? The changelog can be helpful in that situation.
So, again, right off the bat, you've got a pkgdown site ready to go. But as this blog post is saying, there's a lot more you can do here. Like a lot of things that are fundamentally based on R Markdown, which pkgdown is, I'm sure, very much inspired by, a lot of the configuration of your site is driven by a YAML file, the _pkgdown.yml file to be exact. I swear, I now understand why some people keep saying YAML is almost like a programming language in and of itself, because there are so many things we can configure with it. Right? So we're gonna walk through, in the case of pkgdown, some of the options that you have here, starting with the navigation bar. I told you what happens by default, but you can organize this any way you see fit with the structure element.
You can determine what goes on the left, what goes on the right, say those icons that point to the GitHub repository of your package. You can put any other icon or reference that you want in there. There's a lot more you can do with that. And then, to really define how those navigation elements are constructed, you can have components within each of them. So in the example we see here, they've added a new section called expeditions. And within that expeditions section, it's got its own submenu of different elements. This may look familiar, because if you've been using Quarto lately, you can take a similar approach when you make a website with Quarto and how you construct the navigation bar there as well. So there are some interesting synergies here; it's not always synergistic, but there are some interesting things you can take from constructing, say, a Quarto site and whatnot. So there's a menu section where they have the hyperlinks and then the text for each hyperlink.
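A hedged sketch of what that navbar configuration might look like in `_pkgdown.yml` follows. The "expeditions" component mirrors the post's example in spirit, but the menu item names and hrefs here are made up for illustration.

```yaml
navbar:
  structure:
    left: [intro, reference, articles, expeditions]
    right: [search, github]
  components:
    expeditions:
      text: Expeditions
      menu:
        - text: Base camp              # hypothetical article
          href: articles/base-camp.html
        - text: "---------"            # hyphen-only text renders as a separator
        - text: Summit routes          # hypothetical article
          href: articles/summit-routes.html
```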
Again, once you grok it and look at it for a little bit, it makes a lot of sense. And you can also make sure that, if you want to space things out, which is a nice navigation feature, there's a little tip that I saw: you can have arbitrary text in these menus and just put a bunch of hyphens in it, and now it gives you a nice little separator between the different groups, if you wanna organize it in that fashion. So, little things along the way that I learn every time I read these posts. The other meat of a pkgdown site is typically the function documentation. This is where, literally every day, I'm developing an internal package or I'm leveraging an internal pipeline that's using a bunch of packages.
I have on the right side of my screen the pkgdown site of that other package, and I'm looking at the documentation right off the bat. Typically, in RStudio, you could look at the documentation in the editor itself. Lately, I've been using Positron, and they don't have that nice little help browsing panel just yet; you have to type a question mark plus the name of the function, and then it appears. So sometimes I just like having the web browser on the right side with the package documentation right there. That way I can just kind of navigate back and forth. So, back to what we're talking about here: in a pkgdown site, there is a reference section.
And this is where you can give categories or groupings of your functions without necessarily having to do that in the package documentation itself. Case in point, in the example here, they've got a reference section with different titles for these groups in the title argument, and then a description that gives a more plain-text, one-or-two-sentence summary of the contents. And then, inside the contents section, are the names of the functions that go into that group. This is really, really important, especially to me, when you're developing a package that may have twenty, thirty, sometimes forty functions, because there are inherent groupings that help users quickly understand what certain functions do: maybe they're pinpointing an output type of function, or a function that's processing data, or whatnot.
So a lot of the pkgdown sites that I've seen lately are using this grouping feature really effectively. I'm really appreciative that this is easy to do now in pkgdown, to give that more enhanced user experience browsing the functions in your package. So it's really just the title, the description, and then the contents being the names of the functions going into that group. And then, once you rerender, you'll see in your reference section that you've got really nice, easily viewable groups for these functions.
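As a sketch, the reference grouping just walked through takes roughly this shape in `_pkgdown.yml`. The function names here are invented placeholders, not from the post.

```yaml
reference:
  - title: Import
    desc: Read raw expedition data into R
    contents:
      - read_basecamp      # placeholder function names
      - read_summit_log
  - title: Summarise
    desc: Aggregate and visualise the climb
    contents:
      - summarise_climb
      - plot_altitude
```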
All of this, at this point, has been the structure of your site and literally the elements inside of it. We've left out one important detail that, Mike, you're gonna rectify now. Let's bling this up a little bit, shall we? Let's make it pretty, for sure. And that's where,
[00:10:18] Mike Thomas:
you know, CSS and HTML and Bootstrap can come into play here. And the folks who have authored the pkgdown package let us configure all of these things in this really nice YAML document, the _pkgdown.yml file, as you mentioned. There's a section that you can add called template, and underneath that section there are two immediate subparameters. One is bootstrap, which specifies the version of Bootstrap that you want to use; typically, nowadays, I think five is the latest and greatest and has been for some time. Who knows when Bootstrap 6 will drop? And then you can specify some parameters as well. And a lot of folks, in my experience, find the Bootswatch project to be very handy, because it is a set of, I don't know how many there are today, Eric, maybe 30 or so, I feel like. Yeah, I gotta say it's at least 20 of these Bootswatch themes that I've seen in the wild. Yeah. Sort of pre-specified themes in terms of, you know, what color the navbar is, the background, the foreground, font styling, all sorts of things like that. So you can play around with the 20 or 30 preset options that they provide you, which can be a nice starting point. Maybe that works perfectly; you find one that looks exactly like you want.
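A minimal sketch of that template section might look like this; the theme name is just one Bootswatch preset, picked arbitrarily.

```yaml
template:
  bootstrap: 5        # Bootstrap version to build against
  bootswatch: flatly  # any Bootswatch preset theme name
```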
Or, if you wanna go even a little bit further, you can add an extra.css file in your package's pkgdown directory, and that can help update the CSS properties of your pkgdown site, which is really cool. You can set things like the navbar color, the navbar dropdown links, the font weight, all sorts of different types of things. Any CSS element on your pkgdown site that you might wanna customize, you can handle with this extra.css file, and that allows you to leverage whatever branding you want to incorporate, or anything like that that you think is going to make your pkgdown site pop, or fit into the theme and styling that you had in mind for your package. pkgdown, you know, it's awesome. It's something that we have in the R ecosystem that I think has completely exploded.
I think we've talked about this before, but when I go to look up an R package that doesn't have a pkgdown site, it's really sad, because I have to maybe go through one of the old PDF manuals that gets published when a package lands on CRAN, and that's certainly not as fun to explore as an interactive HTML pkgdown site that has the searchability and all those types of cool things. And maybe one of these days, pkgdown and webR will collide, and we can run some of the functions from a package that we're exploring in the browser, at the same place where we're looking at the documentation. That is one of my dreams. We'll see. On a sort of similar tangent, I've been using a lot of Polars from the Python ecosystem lately.
I'd call it maybe a dplyr equivalent with a lot of speed improvements, but in terms of syntax, it's fairly similar to what you might be familiar with in the tidyverse. And it has a great site that looks kind of like a pkgdown site, maybe even a little bit more robust because of everything that's going on in that package. And I noticed this week that there's a new little icon in the bottom right corner that's an AI chatbot that allows you to ask a question of the documentation. And my hope is that it's maybe RAG-based or, you know, fully up to date with the latest developments in the Polars package, so that we are not doing the thing where I go to ChatGPT, which was trained on Polars six months ago,
and I am writing legacy code as opposed to writing the correct code based upon where the package and the project stand at this point in time. So maybe we'll see something like that come to the pkgdown ecosystem as well, but I thought that was an interesting and relevant anecdote.
[00:14:41] Eric Nantz:
Oh, yeah. That is intriguing for sure. And, you know, our listeners might be out there wondering, well, how do I actually know if this package I'm looking at, or I'm using in my R session, has one of these pkgdown sites? There are a few ways you could get to that. One thing I typically check first is I'll search for the package name, and then typically it will be in your search results. But sometimes, if it doesn't quite appear obvious, and the package is on CRAN, you go to the CRAN entry, and there will be a link in the URLs to that pkgdown site. Also, if the package is on GitHub, usually the package author will put the pkgdown site as, like, the website associated with that repository.
Not all the time. Sometimes you have to do a little sleuthing to get to it, but for discoverability, those are the things that I've done as well. But going back to the styling stuff, Mike, I wonder if our listeners may be wondering: in this blog post, they kind of knew to style these, like, .navbar elements and the dropdown menu elements. And for someone new to this world of styling, they may be wondering, how did they know to write that type of syntax? What would be your preferred method to discover those elements?
[00:16:02] Mike Thomas:
To discover the CSS elements that we want to target? Yeah. Yeah. Like, how would they know what to target when they wanna style the navbar? How do you think they would get there? Yeah, it's a good question. So, you know, the workflow that I typically have, and it takes some getting used to, and I imagine that this may be your workflow as well: you can preview the site in your browser, and then, depending on what operating system you're on, you should be able to right-click in your browser and find a navigation to something called developer tools, or "inspect this page."
And that typically brings up this right-hand sidebar that will show you all the HTML and CSS properties of the site that you're looking at. And if you hover over some of the different elements in that sidebar, it should highlight, on the left-hand side of your screen, the related area of the UI that that CSS element relates to. And you should be able to expand and collapse these different CSS properties to take a look at their components, the parameters that you can configure. And once you are able to do that, you can get into a bit of a workflow of understanding which element you need to target, then naming that correctly in your custom CSS file and overriding or updating the particular property that you'd like to change.
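Once a selector has been identified in the dev tools, the override goes into the extra.css file. Here's a made-up illustration; the selectors and values are assumptions for demonstration, not copied from the post.

```css
/* extra.css: override pkgdown defaults for elements found via dev tools */
.navbar {
  background-color: #2c3e50; /* example navbar color */
}
.navbar .dropdown-item {
  font-weight: 600; /* heavier dropdown link text */
}
```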
[00:17:33] Eric Nantz:
Awesome tip there. And I wanted to make sure we covered that, because when I was getting into a lot of the Shiny UI customization stuff many years ago, I would see these fantastic-looking apps, and this is well before bslib came into the picture, and it just seemed like such magic to me. How did they do that? How did they know what to target? And since then, and since reading some great resources like, you know, Outstanding User Interfaces with Shiny by David Granjon and others, there was always one common theme: in web resources, whether it's Shiny, whether it's pkgdown, or just Quarto in general, or anything that you build with HTML, CSS is behind it one way or another. And one of the best ways to get there is to get into the system, as I say, with the dev tools or the dev console.
And once you get the hang of it, well, it does take a little getting used to. I just wanted to reassure our listeners out there that may be new to this that it may seem really cryptic at first. No, just start simple. Just style one thing. Try it out. The best thing about those dev consoles is that you can do all of this interactively; it's not gonna break your actual code. Right? It's just literally in that preview. There's no real cost. It's like a sandbox to try things out. It always seemed like, oh my gosh, if I'm changing it here, how do I know what it does over there? It just takes some getting used to. But I thought it was a nice touch in this post, because I don't often see this level of detail with pkgdown sites. So, a really great post to walk us through it.
[00:19:12] Mike Thomas:
Absolutely. No, that's a good tip. Definitely don't be scared that anything you change in your developer tools is going to be stuck forever.
[00:19:35] Eric Nantz:
So we were just talking about package documentation, and a lot of times, especially for those packages that are offering a statistical methodology, maybe a new modeling algorithm under the hood, just like anything in stats, there's a lot of math behind it. And a lot of times, especially when you look at things like a package vignette or a research paper associated with it, you've got yourself some mathematical notation, and that is usually written with LaTeX. Yes, I had to write my dissertation with LaTeX. It was not fun, I have to be honest. I knew enough to be dangerous, but not much beyond that.
And I'd been away from writing LaTeX for quite a while, until literally a few weeks ago, when, as I was telling Mike in the preshow, I was working on a Shiny app that's surfacing some Bayesian methodologies under the hood. And I am a Bayesian novice; I know just the very high-level details of how things work. We were trying to solve a thorny issue with one of our distributions not giving the right probability of success, where I was getting nonsensical results and the probability was going outside the range of zero and one. And I look at the code, and I'm realizing I really don't have an idea of how some of this works. I've gotta write this out to see where the problem is.
So I wrote LaTeX in a Quarto doc, and that brought back some memories, folks. And LaTeX, whether you like it or perhaps even don't like it, is the tried and true standard for mathematical notation. I've been talking about it in the context of documentation, but there are a lot of times, especially in statistical teaching, whether it's in grad school or with, say, an application or a web page that is surfacing a visualization of a model type, when you wanna actually inject, if you can, the mathematical notation of that distribution or that probability function into the visualization itself, to help the reader, you know, maybe learn along the way.
If you've ever wanted to do that in R itself, there may have been some approaches in the past, but this next highlight has a pretty fascinating approach to make all this happen. And this is the xdvir package. Maybe it's pronounced "x-divir"? No idea. But it is authored by Paul Murrell, who is a member of the R Core team and has been one of the front runners, the trendsetters, of leveraging the grid graphics system in R ever since its inception. So talk about a tried and true, high-quality pedigree coming into play here.
And so we're gonna talk a little bit about what this package actually does, and we'll have a link to the vignette we're about to narrate here in the show notes, if you wanna follow along afterwards. But the main purpose of xdvir is to let you put LaTeX snippets in your R plots. Notice I said plots; I didn't say specifically which type of plot, because xdvir actually supports all of them. That's an achievement in and of itself. So, under the hood, what's really happening here? Well, LaTeX, as some of you may know, is a type of markup language for, again, mathematical notation, and a very precise way of defining all that.
So, in order to put these equations, or this notation, onto a plot, you need three things. You need the language itself, LaTeX. You need the typesetter, which is gonna determine how this LaTeX notation is translated into the glyphs and the fonts that go into these plots. And then you need to actually render it at the location where the user wants it in the plot itself. So the xdvir package is automating a lot of the manual steps in those three stages. First, you define a character vector. You can call it anything you want, but inside that vector you've got the LaTeX notation right there.
So once you define that, you can, you know, probably do what I did: I had to look up a bunch of LaTeX cheat sheets a couple years ago to figure out, how do I do the integral? How do I do the sigma summation? Flashbacks. Lots of flashbacks to that. So in this example in the vignette, he's got the notation for what looks like the standard normal distribution, if I spy correctly. And then, once you have that ready to go, you can choose, from whatever plot method you're using, how to put it in.
And the package comes with a function, in the case of lattice graphics, a grid.latex() function, where you give it that LaTeX code and then the x and y coordinates of where you're putting it. And this is also leveraging some additional packages, such as, of course, the grid package itself, to help you with that positioning. And once you have that, as you see here, there's a plot in the vignette of the standard normal distribution with the CDF, I believe, or the PDF of the function, in the upper left corner, and it looks really sharp. Very legible.
Not like a fuzzy copy-pasted thing that you would do manually; it looks the way the grid system should. And you can do this, of course, with other types of plots, such as a traditional base plot: if you just did a standard plot of y against x, you can use the grid.latex() function there too. Again, a very similar paradigm to what we just saw with lattice. And last but not least, you can also do this with ggplot2. Many of our listeners love using ggplot2 for their visualization. So the xdvir package comes with additional functions where you can put a LaTeX grob into a ggplot object.
And, again, the parameters are the same: the LaTeX code, x and y coordinates, and the justification. Lots of interesting ways you can do this. There are even other helpers for the ggplot2 users out there, such as element_latex(), where if you want to put this in labels, such as your title or whatnot, you can throw it in there too. And you can also do it for styling symbols in your plot, such as the points on a scatter plot, as an example of that as well: putting, like, the notation for the sample average with subscripts on selected points.
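For the ggplot2 side, here is a hedged sketch using element_latex(), the theme helper just mentioned. The labelling details below are assumptions based on the vignette's description, not copied from it.

```r
library(ggplot2)
library(xdvir)  # provides element_latex() for ggplot2 theme elements

df <- data.frame(x = seq(-4, 4, length.out = 200))
df$y <- dnorm(df$x)

# element_latex() asks ggplot2 to typeset the plot title as LaTeX
ggplot(df, aes(x, y)) +
  geom_line() +
  labs(title = "$\\phi(x) = \\frac{1}{\\sqrt{2\\pi}} \\, e^{-x^2/2}$") +
  theme(plot.title = element_latex())
```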
I mean, that's awesome. It works, and it looks like a pretty easy way to opt right into it. So there are some things you want to know if you want to embark on this with your next visualizations. We were just talking at the high level, but in order to pull this off, the machinery that's needed is, of course, a LaTeX distribution on your system. A lot of people like to use the TinyTeX distribution that Yihui Xie authored many years ago as a very small, fit-for-purpose version of this, but you could use other ones like XeTeX, I'm not sure how to say it, or LuaTeX. There's a bunch of other ones you could use with it too.
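If you want to try this, the only real setup step is that LaTeX distribution. TinyTeX can be installed from R itself with the documented {tinytex} helpers:

```r
# Install the {tinytex} helper package from CRAN first:
# install.packages("tinytex")

# Download and install the TinyTeX distribution (a small TeX Live)
tinytex::install_tinytex()

# Check that R can see the TinyTeX installation
tinytex::is_tinytex()
```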
And then once you have that distribution, that might be the only real prerequisite you need. But for knowing what's happening under the hood, the link that we have in the show notes is the more verbose manuscript that Paul has authored about these different stages of the processing: the authoring step, where it talks about what it surrounds the LaTeX code with to make a standalone LaTeX document that is ready for further processing; how it's converted into the DVI intermediate object that LaTeX often produces in between going from the raw LaTeX code to your output object, which may be a PDF or, in this case, a snippet that will go into a grid graphic later on; how it's dealing with typesetting; and how you might debug this when things don't work. Because, oh no, I did have a lot of late nights trying to debug LaTeX when I was compiling that dissertation before the deadline, figuring out why the heck... you know what? It's just like Shiny. Missed a bracket.
Missing brackets in LaTeX is not fun because it's not as friendly as saying you missed a bracket. The error messages are quite cryptic. So there's some narrative about that as well, along with the different customizations that you can do with rendering, the resolution and whatnot, doing multiple pages. There's a lot under the hood here. So, again, I only know enough about LaTeX to be dangerous, but next time I'm doing a static graph that has a statistical concept to surface, I'm gonna give xdvir a try and see what I can do.
[00:29:21] Mike Thomas:
Absolutely. And I feel like these are the packages that don't get as much credit as they probably deserve. And it sounds like Paul Murrell has really broken a ton of ground in this space over the years. So I think this is an appropriate time to give Paul a huge shout-out and a huge thank you for all the work that he's done in this space, to get those of us who take it for granted to get our plots to render the way that we want them to render. Right? Which is not a trivial task, the more and more that I read up about it. And you're exactly right, Eric. You know, Paul talks about some of the related packages that are out there, because we do have a lot of choice now in the open-source software ecosystem. And xdvir provides a lot of similar functionality to gridtext, ggtext, the marquee package, which I think is one of the more recent ones by Thomas Lin Pedersen, and some other folks.
But sort of the main difference between xdvir and those packages is that the others are built on Markdown, while xdvir is built on LaTeX, and in Paul's words, with all of the joy and pain that that brings. Definitely my favorite excerpt from this vignette. And it couldn't be more true, because I think LaTeX allows us to do some of the customization that we sometimes really need, and have that ability to really, for the most part in my experience, fully customize whatever we want to show on screen in our PDFs, as opposed to other methodologies. But you're exactly right, it can be cryptic in terms of the error messages. The syntax is, I think, objectively not the prettiest to look at.
And, you know, it's just one of those things. And I really did also appreciate, as you mentioned, the final section of the vignette around troubleshooting, where it's a little bit of a deep dive into the weeds on how xdvir relies on the glyph rendering system that was added in R version 4.3.0. So fairly recent developments, it seems, made this possible. And there's a little bit of a discussion about the particular graphics devices that you will have to have available and set, and ensuring that you have a TeX installation. Right? And a lot of us use TinyTeX for that purpose. That's one I would recommend. I think that's a Yihui Xie project that is another unsung hero. Even if it is sung, it's not sung enough, in my opinion. Well said. This vignette provides a really great walkthrough of the more detailed manuscript. So feel free to check out both.
[00:32:05] Eric Nantz:
Yeah. And like I said, I've been in the weeds a bit with this other app of surfacing this information, and we do have to, at some point, build in a feature to get more of the static outputs out of the app for, you know, further discussions with a team, or to file as a record for posterity, if you will. So I may end up actually using this in that app when we do, like, a ggplot2 version of that distribution plot. Because when we're mapping out these prior distributions that we're letting the user customize in this app, we indeed do have a section in the app that literally plots out that distribution function for the prior, such as a normal distribution, or other ones for hazards and whatnot. So, yeah, I'm gonna give this a play and see what happens.
And there's a lot more to play with in this issue as well. We got a lot of great additional resources that Batool has put together for us in this issue. So we'll take a couple minutes for our additional finds here. I'm gonna get into the architecture weeds a little bit on this one, but it's a great achievement nonetheless. A recent blog post from Dirk Eddelbuettel has announced that the r2u repository now serves arm64 versions of R packages. Woo-hoo! So you may be wondering, why is this a big deal? Well, as he points out in his blog, a lot of the major cloud infrastructure providers are starting to move their servers to ARM to save on cost and power usage, and in some cases, especially in the Apple space, a lot of the newer laptops are powered by ARM processors, such as the M1 chips and the like.
ARM hasn't exactly been a new thing for the R ecosystem, but this is a very new thing for the Linux users out there that are leveraging ARM processors, whether it's through their systems or through containers, and also through GitHub Actions. That's how all this kinda came to be: more recently, Microsoft, who, of course, owns GitHub, has now let users target a version of Ubuntu with ARM architecture, and not just the typical x86_64 architecture that most Intel and AMD processors are using. So he does mention that there is a lot of support right off the bat. I think more than three quarters of the packages are in binary format right away because they don't contain compiled code. Of the ones that are compiled, I think he says about 4,500 of the 5,000 are supported in this repository, so there's still a handful to go.
But he says that it is on the way to being available for everybody. So congratulations to Dirk and the team behind r2u. This is a massive achievement in the space of package architecture in the R ecosystem.
[00:35:07] Mike Thomas:
Thank you for calling that one out. If you didn't, I was going to, Eric. You beat me to the punch. Yes, big news for us. We're heavy, heavy users of the r2u project, albeit primarily in Docker Linux environments. But I think this is still super important for some of those use cases that we do have where this will show up. So awesome achievement, and thanks to Dirk and everyone else for their work on that project in general. I did want to shout out a package that I admittedly have not used yet, but I acknowledge is absolutely awesome. And it's the tidyplots package, by Jan Broder Engler, who's out of Germany.
And there is a new website out there called tidyplots use cases that he's put together, with a bunch of beautiful charts, each with a little collapsible code snippet above it. And the conciseness of the R code that we're able to write to generate these beautiful plots is pretty mind-boggling. So I would recommend just checking out this site. It's very visual. It's really all charts with collapsible code snippets, and I am going to guess that, like me, it will blow you away. So a pretty cool project, and one to check out. And maybe this is a very easy introduction into it. And I really applaud him for taking the step of creating this particular site, this particular page, because I think it really demonstrates this package very, very well, and it was a great sort of eye-opening moment for me to then dive into the GitHub repo and the pkgdown site and take a deeper look at the tidyplots package.
[00:36:59] Eric Nantz:
This is fantastic. Oh, I'm so glad you called this out. I literally see things here that, a, would have been very useful for me about eight or nine years ago when I was doing a lot of biomarker analysis. There's a section for bioinformatics plots here, such as the famous volcano plot for p-values and fold change, and correlation heatmaps. Oh, my goodness. This tidyplots package is really top-notch for getting you up and running really quickly. I'm gonna assume it's building upon ggplot2, because it looks very, very similar, but yet look at the lines of code. It takes maybe just three or four lines to get a lot of these plots up and running. So, okay, this is bookmarked for sure. This is a great resource.
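As a taste of that conciseness, here is the kind of piped call the use-cases site shows. This is a sketch modelled on the package's published examples; `study` is a demo dataset that ships with tidyplots, and the function names below are from its documented API.

```r
library(tidyplots)

# A complete plot in four piped calls:
# points, mean bars, and standard-error bars, colored by treatment
study |>
  tidyplot(x = treatment, y = score, color = treatment) |>
  add_data_points_beeswarm() |>
  add_mean_bar(alpha = 0.4) |>
  add_sem_errorbar()
```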
And, again, for any visualization type of package or utility that you put out there, this is a great template to draw upon. The code's available on demand, and you can quickly jump to what visually pleases you the most, then grab that and run with it. Really good find here.
[00:38:05] Mike Thomas:
Yeah. I'm seeing in the DESCRIPTION file ggplot2, ggbeeswarm, ggpubr, ggrastr, ggrepel. The Hmisc package, I think that might be bioinformatics
[00:38:18] Eric Nantz:
related, if I'm thinking of some... Oh, that's from Frank Harrell. Frank Harrell has written the Hmisc package for many years, which I remember using because I had one of his textbooks on survival analysis. And so he had some of his own functions in there. So he's been maintaining that for probably over twenty years at least.
[00:38:39] Mike Thomas:
I was trying to remember where I had seen that one before. And patchwork is in there as well. Oh, yeah. So, yeah, we've got all the visualization heavy hitters.
[00:38:49] Eric Nantz:
Excellent. So again, these are the things that you see every week in R Weekly. There's always something new that Mike and I will look at, and we're like, did you know that was there? No, I didn't. Well, we do now.
[00:39:01] Mike Thomas:
tidyplots is another one. Yeah, you deserve your flowers, whoever authored tidyplots. I'm gonna take the mask off and let the audience know that we do this as much for you as we do for ourselves, to keep up to date with what is going on in the R ecosystem.
[00:39:17] Eric Nantz:
I tell everybody, whether it's at the day job or people I talk to online, that R Weekly is one of the best things I ever discovered, because I've learned so much from these issues, being able to talk about them and then learn from them. Yeah, can't have it any better. Like I said, look at the back catalog of R Weekly as well. There is a lot more to choose from. And I know, Mike, you've been looking at the back catalog during your time off and saw some great things. So, yeah, there's a lot that happens every week. So where is it, you ask? It's at rweekly.org, of course. You should have that in your bookmarks, and it's ready for you right there to read at your leisure.
And, also, it is a community project, so we invite you to help out in whatever way you can. R Weekly has always been kind of under this value-for-value mindset. If you get value from it, we would love to have value back. Whether it's just your time, sharing it with your colleagues, or sending us a pull request with that great new resource that you found online, we'd love to hear about it there too. And, also, in the case of this very podcast, we have a fun little boost mechanism. If you wanna send us a little fun along the way, we have links on how you can do that as well. But, also, we'd love to hear from you on our social media channels. So if you wanna get in touch with us, you can find me on Bluesky at [email protected].
Also on Mastodon, where I'm at [email protected], and I'm on LinkedIn; just search my name, and you'll find me causing all sorts of fun there. And, Mike, where can the listeners find you?
[00:40:55] Mike Thomas:
Primarily on Bluesky these days at mike-thomas.bsky.social, or on LinkedIn. You can find me if you search Ketchbrook Analytics, k-e-t-c-h-b-r-o-o-k.
[00:41:09] Eric Nantz:
Yeah, hard to believe. We just finished one ninety-eight. We've got two more to go before the big one. So as I keep saying, if you have a great memory of what we talked about in the past and what you find most enjoyable about R Weekly, we'd love to hear about it. So definitely get in touch with us on social media. We also have a contact page that's linked in the show notes if you wanna share your feedback too. We'd love to read it on the air for episode 200. We'd love to hear from you. With that, we're gonna close up shop for episode 198. Thank you so much for joining us today, and we'll be back with episode 199 of R Weekly Highlights next week.