#machine_learning


In my previous job I was in charge of data collection for our #machine_learning models.

• We sent people around the world to collect handwriting samples
• We implemented a "help us improve" pipeline in our apps: *opt-in*, designed to respect GDPR, and able to expunge data when users changed their minds
• We respected the licenses of academic databases

The ethics there should be championed. *Management supported doing things right*: just because the data is out there doesn't mean you can train on it.

@timnitGebru

Going to #EGU25 and working on #machine_learning in #geoscience ?
Submit your abstract to our PICO session focussing on strategies and applications of AI and ML in a spatial and spatio-temporal context (meetingorganizer.copernicus.or) or
to the session on applications for large-scale mapping of environmental variables by combining ground observations, remote sensing, and machine learning (meetingorganizer.copernicus.or)

meetingorganizer.copernicus.org: Session ESSI1.2

(continuing my exploration of #claude )

I am also quite impressed by Claude's ability to produce interactive little web programs based on the user's prompts, and re-adjust them when needed during the conversation.

Here is a simple example, starting with the prompt

Can you produce an interactive plot. The user should be able to enter into a text field a mathematical formula in one variable x using reverse polish notation. This formula will then be evaluated and the result displayed in the plot. It is sufficient to have the basic arithmetic operations and the sin function in addition.

This already produced a perfectly workable solution, and all the subsequent prompting was only for fine-tuning some details of the graphics.

See the result here:

claude.site/artifacts/20a5ad3b
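
For reference, here is a minimal Python sketch of the core idea, not Claude's artifact (that is a small web app), just the RPN evaluation and plotting the prompt asks for; the formula string and function names are my own illustration.

```python
import math
import operator

import numpy as np
import matplotlib.pyplot as plt

# Binary operators supported by the toy evaluator.
OPS = {"+": operator.add, "-": operator.sub,
       "*": operator.mul, "/": operator.truediv}

def eval_rpn(expr, x):
    """Evaluate a reverse-Polish expression in one variable x.

    Supports +, -, *, / and sin, e.g. "x sin x *" for sin(x) * x.
    """
    stack = []
    for tok in expr.split():
        if tok == "x":
            stack.append(x)
        elif tok in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        elif tok == "sin":
            stack.append(math.sin(stack.pop()))
        else:
            stack.append(float(tok))
    if len(stack) != 1:
        raise ValueError("malformed RPN expression")
    return stack[0]

# Plot the user-supplied formula over a fixed range.
formula = "x sin x *"                      # sin(x) * x in RPN
xs = np.linspace(-10, 10, 500)
ys = [eval_rpn(formula, x) for x in xs]
plt.plot(xs, ys)
plt.title(formula)
plt.show()
```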


(technical discussion about Gaussian processes and Claude, continued)

Claude first pointed out that, starting at d=4, Matern kernels can no longer be defined to match the required high-wavenumber behaviour. I then asked it about the short-distance behaviour in real space and pushed back a little, claiming that I did not see why a 1/r^2 kernel in real space is so bad - after all, I could apply it to a smooth function and be fine, since the volume element might save me. But Claude gave me a very convincing argument for why we really do run into trouble, and it connected the problematic behaviour to upper critical dimensions in statistical physics.

Quoting Claude: "In statistical field theory language, this would be described by saying that the operator (∇²f)² becomes "irrelevant" above four dimensions - its effects get washed out by the ultraviolet (short-distance) singularities of the theory."
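
My own rough reconstruction of the dimensional counting behind this (not a transcript of the chat): the prior fixes the spectral density at large wavenumbers, and the pointwise variance becomes ultraviolet divergent once d >= 4,

\[
P[f] \propto \exp\!\Big(-\tfrac{1}{2}\int d^d x\,(\nabla^2 f)^2\Big)
\;\Longrightarrow\;
S(k) \propto \frac{1}{k^4},
\qquad
\langle f(x)^2\rangle \propto \int \frac{d^d k}{(2\pi)^d}\,\frac{1}{k^4}
\sim \int^{\Lambda} k^{\,d-5}\,dk ,
\]

which blows up at large k (logarithmically at d = 4, as a power above), so the process has no well-defined pointwise values in d >= 4.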

After this fascinating discussion I then came back to my original goal of finding a suitable generalization of Bezier fitting in higher dimensions, and Claude introduced me to 'Thin Plate Splines' and other options, nicely connecting them back to our discussion of the short-distance behaviour of Gaussian processes in higher dimensions.
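
For anyone who wants to try the thin plate spline route in practice, here is a minimal sketch (my own, not from the chat) using scipy's RBFInterpolator, whose 'thin_plate_spline' kernel implements exactly this; the sample data are made up.

```python
# Thin plate spline fitting of scattered 2D data with scipy.
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(50, 2))                 # scattered sample locations
vals = np.sin(3 * pts[:, 0]) * np.cos(3 * pts[:, 1])   # noisy observations
vals += 0.05 * rng.normal(size=50)

# smoothing > 0 turns exact interpolation into a penalized, spline-like fit.
tps = RBFInterpolator(pts, vals, kernel="thin_plate_spline", smoothing=1e-3)

# Evaluate the fitted surface on a regular grid.
grid = np.mgrid[-1:1:100j, -1:1:100j].reshape(2, -1).T
fitted = tps(grid).reshape(100, 100)
```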

So, overall, I am very impressed with Claude!

[and hurray for the new 1500 character limit on fediscience]

2/2

#AI #chatbots #Claude

I have been consistently skeptical of chatbots, but during the past two days I have been extremely impressed by Claude (the Anthropic chatbot), so I am officially revising my opinion!

I just had a chat with Claude about Gaussian random processes and how to use them to do function fitting. The following discussion is a bit technical, but the short summary is that Claude really taught me new things, like a graduate student who happens to have studied the literature on a particular topic more thoroughly than I have had the chance to, and who actually understands what they are talking about.

I explained to it that I wanted to implement a prior given by the exponential of (minus) the integral of the Laplacian squared in two dimensions. I know that the corresponding choice in 1D is equivalent to Bezier spline curve fitting. Claude proposed the Matern kernel with parameter nu=1 and was able to argue why this is the right choice based on its Fourier transform. I had lengthy discussions with two other models (Mistral and GPT-4o mini) that claimed nu=3 would be correct, but it turned out they had just made a simple math mistake and confused themselves.
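
For concreteness, here is the Fourier-transform argument as I would reconstruct it (my sketch, not Claude's wording): the Matern kernel with parameter nu in d dimensions has spectral density proportional to (kappa^2 + k^2)^{-(nu + d/2)}, while the prior corresponds to S(k) proportional to 1/k^4, so matching the large-k (short-distance) tail requires

\[
\nu + \frac{d}{2} = 2 \quad\Longrightarrow\quad \nu = 2 - \frac{d}{2},
\]

i.e. nu = 3/2 for the 1D spline prior, nu = 1 in two dimensions as Claude proposed, and no admissible nu > 0 left once d = 4.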

What happened afterwards was a really deep discussion: I asked Claude about the behaviour in higher d, and it turned out that starting from d=4 there is no sensibly defined Gaussian process with that prior.

1/2

#AI #chatbots #Claude