Automatically fix GitHub Actions failures with Copilot coding agent [GitHub Copilot Pro]

TLDR: In this blog, we'll show you how to automate GitHub Actions failure handling using GitHub Copilot's coding agent to create issues and fix broken builds automatically.

See It In Action First

Here's what the complete workflow looks like when your CI fails:

🔒 Security First: This workflow gives AI access to your repository. We'll configure it to only access selected repos and disable data collection. See full security details.

The result: Your CI fails at 9 AM, Copilot creates a fix by 9:05 AM, and you review and merge from your phone during your coffee break.

What we will implement

• When your CI fails, a workflow automatically creates a detailed issue with failure context and assigns it to Copilot's coding agent. A minimal sketch of such a trigger workflow follows below. ...
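To make the TLDR concrete, here is a minimal sketch of what a failure-triage workflow could look like. It assumes your CI workflow is named "CI"; the file name and the `Copilot` assignee handle are assumptions for illustration, not the exact configuration from the post — whether the default `GITHUB_TOKEN` can assign issues to the coding agent depends on your organization's Copilot setup.

```yaml
# .github/workflows/ci-failure-triage.yml (hypothetical file name)
name: Create issue on CI failure
on:
  workflow_run:
    workflows: ["CI"]   # assumes your CI workflow is named "CI"
    types: [completed]

permissions:
  issues: write

jobs:
  file-issue:
    # Only run when the watched workflow actually failed
    if: ${{ github.event.workflow_run.conclusion == 'failure' }}
    runs-on: ubuntu-latest
    steps:
      - name: Create an issue with failure context
        env:
          GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        run: |
          gh issue create \
            --repo "${{ github.repository }}" \
            --title "CI failed on ${{ github.event.workflow_run.head_branch }}" \
            --body "Failing run: ${{ github.event.workflow_run.html_url }}" \
            --assignee "Copilot"  # assignee handle for the coding agent is an assumption
```

The `workflow_run` trigger fires after the named workflow completes, so this job can inspect the run's conclusion and link the failing run's URL into the issue body for context.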

Set up free web scraping in less than 5 minutes using GitHub Actions

TLDR: Create a new GitHub repository and add a workflow file containing a schedule and a curl statement that downloads a JSON file from an API endpoint on a cron schedule into the repository. Visualize the results using Flat-viewer. See below for a more detailed step-by-step guide.

```yaml
on:
  push:
  workflow_dispatch:
  schedule:
    - cron: '6,26,46 * * * *' # every twenty minutes
jobs:
  scheduled:
    runs-on: ubuntu-latest
    steps:
      - name: Check out this repo
        uses: actions/checkout@v2
      - name: Fetch latest data from the API endpoint
        run: |-
          curl -s "https://www.nu.nl/block/lean_json/articlelist?limit=20&offset=0&source=latest&filter=site" | jq '.data.context.articles' > headlines.json
      - name: Commit and push if the data has changed
        run: |-
          git config user.name "Automated"
          git config user.email "actions@users.noreply.github.com"
          git add -A
          timestamp=$(date -u)
          git commit -m "Latest data: ${timestamp}" || exit 0
          git push
```

Git scraping

This guide is inspired by Simon Willison's excellent "Git Scraping" concept (https://simonwillison.net/2021/Mar/5/git-scraping/), which combines the free compute in GitHub Actions with storing flat files (e.g. JSON) in a Git repository. ...
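Before committing the workflow, you can sanity-check the fetch step locally. This is a quick sketch assuming you have curl and jq installed and that the endpoint still returns the same JSON shape as in the workflow above:

```sh
# Preview the scraped data locally before wiring it into the workflow;
# drop "| length" to inspect the full article array
curl -s "https://www.nu.nl/block/lean_json/articlelist?limit=20&offset=0&source=latest&filter=site" \
  | jq '.data.context.articles | length'
```

If the count prints as expected, the cron job's `curl | jq` step should produce a valid headlines.json on each run.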