+++
title: First Post
date: 2019/01/15
published: true
tags: general
+++
I initially coded this blog during 2018 as a way to learn some vuejs coding and to create a page which didn't rely on any php. I like using php, but for such a small site it causes a lot of needless overhead. I liked the idea of having a site which is hosted on a public gist on github: the site fetches everything, including all pages and posts, as a single asynchronous call to the github api, then parses all of the files into arrays of pages and posts stored in a Vuex store. This makes switching between pages very quick because everything is already loaded from that first call.
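A minimal sketch of that fetch-and-parse step (the gist id, Vuex mutation names, and front matter parsing below are illustrative, not the site's actual code):

```js
import axios from 'axios';

// Load every page and post from the gist in one API call and commit them to Vuex
export async function loadSite(store, gistId) {
  const { data } = await axios.get(`https://api.github.com/gists/${gistId}`);

  const pages = [];
  const posts = [];
  Object.values(data.files).forEach((file) => {
    const entry = { name: file.filename, ...parseFrontMatter(file.content) };
    if (file.filename.endsWith('.page.md')) pages.push(entry); // name.page.md
    if (file.filename.endsWith('.post.md')) posts.push(entry); // name.post.md
  });

  store.commit('setPages', pages);
  store.commit('setPosts', posts);
}

// Split the +++ block at the top of each file from the markdown body
function parseFrontMatter(content) {
  const [, meta = '', body = content] = content.split('+++');
  const attrs = {};
  meta.trim().split('\n').forEach((line) => {
    const [key, ...rest] = line.split(':');
    if (key) attrs[key.trim()] = rest.join(':').trim();
  });
  return { ...attrs, body: body.trim() };
}
```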
I have no idea how this will work when the number of posts increases but for now it is quite snappy and I'll fix it later if it comes to that.
The basic skeleton for the site is generated using vue-cli 3, which is a quick and easy way to spin up a new site and takes care of the webpack settings. I have gone through the process of creating a vuejs site from scratch, including setting up the webpack configuration, but that takes a lot of time and head scratching and is quite frankly not something I like doing. vue-cli makes that process super simple. It also includes code linting built in, which forces me to think harder about what I write.
I have also added a deploy.sh shell script which I can run locally: it runs the production build of the site and uploads it to the server, where a post-update hook then does whatever is needed on the server side. This saves me a lot of time that I used to spend either using ftp or ssh-ing in to pull the changes onto the server.
I have plans to make some posts on here when it suits me. For example, I might post the details of the deploy.sh script described above.
I have also provided some background on various small websites that I've developed on this server; these are described under the Sites link on the navbar.
This site currently accepts only pages and posts but I am thinking of adding some way of parsing/presenting photos as another file type in the gist which contains the public side of the site.
+++
title: Update on Euro 2020 website - WYSIWYG
date: 2020/03/12
published: true
tags: javascript, vuejs, quilljs, laravel
+++
I had initially been using the Trix editor with the vue-trix Vue wrapper as the WYSIWYG editor on the forum for the site. This worked reasonably well, but I wasn't overly happy with the look of the editor, and the implementation of images, tweets, etc. was a bit messy: lots of backend and frontend parsing of special codes like [tweet](12345) was required to show the tweet with an id of 12345, which would then be rendered once the editor was closed. This would have required some instructions for users and I could see it being a hassle down the line.
I did some looking around for different editors and stumbled across the Quill editor. This had a cleaner look in my opinion but also allowed for (relatively) easy extensions through modules and formats. I could also remove a lot of the clutter I previously had in the backend for sanitising and storing the post.
The new forum editor incorporates several modules and formats (a rough registration sketch follows the list):
quill-image-resize-module: This module lets the post author resize an uploaded image and adjust its alignment. (The actual module had not been updated in 3 years and was giving all sorts of headaches to my webpack / npm scripts, so I cloned that repository and made some modifications to bring it up to date.)
quill-image-drop-module: This module allows for images to be dragged and dropped into the editor as an option instead of clicking the "image" button in the toolbar.
quill-magic-url: This module recognizes links as they are typed and automatically inserts the hyperlink tags. I have used this module as a starting point and added an additional set of regular expressions to detect image urls (either pasted or typed) and insert the image in place of the link.
format/twitter: This was bodged together from several others that I found. It takes a twitter URL and inserts a div into the editor with a ql-tweet class, with data-id set to the tweet ID and data-url set to the url. The official twitter.js createTweet() function can then be run to insert the tweet inside that div. This rendering only exists in the editor view; when the data is submitted to the database only the div tag with the id and url is stored. When the editor is closed the tweets can be rendered by searching for all .ql-tweet elements in the vue component and rendering them.
twitter paste: Based on the quill-magic-url module above, I modified the quill.clipboard.addMatcher() function to recognize tweets pasted either as their url (ie. https://www.twitter.com/1) or as the embed block generated by twitter when you click "Embed this Tweet" on their website. If the pasted text contains a tweet, I capture that portion of the text and replace it with a ql-tweet div block as described in the twitter format note above.
giphy: Since people like to stick animated gifs in their messages I decided to add a search box for the giphy.com API. It starts with an icon added to the editor toolbar that, when clicked, fires a Vue event. The Giphy component listens for that event and appears on the right side of the page when triggered. Typing a search term in the box and clicking "search giphy" returns the first 10 gifs from the API. Clicking on the desired gif kicks off a javascript function to store that img tag in the browser clipboard so it can be pasted into the editor. This definitely works on Chrome and Safari but I haven't tested other browsers yet, so that functionality may be disabled in some browsers if necessary.
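To give a feel for how these pieces hang together, here is a rough registration sketch - the import styles follow each package's README, and the tweet blot, regex, and hydration step are simplified assumptions rather than the site's exact code:

```js
import Quill from 'quill';
import ImageResize from 'quill-image-resize-module';
import { ImageDrop } from 'quill-image-drop-module';
import MagicUrl from 'quill-magic-url';

const Delta = Quill.import('delta');
const BlockEmbed = Quill.import('blots/block/embed');

// Minimal stand-in for the format/twitter blot: a div.ql-tweet carrying the id and url
class TweetBlot extends BlockEmbed {
  static create({ id, url }) {
    const node = super.create();
    node.dataset.id = id;
    node.dataset.url = url;
    return node;
  }
  static value(node) {
    return { id: node.dataset.id, url: node.dataset.url };
  }
}
TweetBlot.blotName = 'tweet';
TweetBlot.className = 'ql-tweet';
TweetBlot.tagName = 'div';

Quill.register(TweetBlot);
Quill.register('modules/imageResize', ImageResize);
Quill.register('modules/imageDrop', ImageDrop);
Quill.register('modules/magicUrl', MagicUrl);

const quill = new Quill('#editor', {
  theme: 'snow',
  modules: {
    imageResize: {}, // resize / realign uploaded images
    imageDrop: true, // drag-and-drop images into the editor
    magicUrl: true,  // auto-link urls as they are typed or pasted
  },
});

// Paste handling: swap a pasted tweet url for the ql-tweet placeholder
const TWEET_URL = /https?:\/\/(?:www\.)?twitter\.com\/\w+\/status\/(\d+)/;
quill.clipboard.addMatcher(Node.TEXT_NODE, (node, delta) => {
  const match = node.data && node.data.match(TWEET_URL);
  if (!match) return delta;
  return new Delta().insert({ tweet: { id: match[1], url: match[0] } });
});

// Outside the editor, hydrate every stored placeholder with twitter's widgets.js
document.querySelectorAll('.ql-tweet').forEach((el) => {
  window.twttr.widgets.createTweet(el.dataset.id, el);
});
```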
These changes make the forum experience a bit more straightforward in my opinion and remove the need for markdown or special formatting codes. The proof will be in the pudding come June when the site launches.
+++
title: Update on Euro 2020 website (Covid-19)
date: 2020/10/22
published: true
tags: euro2020, coronavirus
+++
So obviously the European cup was cancelled for 2020 and will (presumably) happen in 2021. I had the test site up and running with automatic updates for the EPL and everything seemed to be going really well: scores were updated every minute when matches were live, and predictions were checked against scores for both the "live standings" and the "official standings" - and then all matches were postponed. My scripts couldn't really cope with every future match going to postponed because of the way I was structuring the database; that was mostly to do with accommodating the EPL, which has 38 matchdays instead of the 5 or 6 rounds of a Euro or WC.
So everything was shuttered for a bit, but I have been working on updating the code offline and will resume steady testing in the new year as we get closer to the proposed kick off date for the Euro 2021 tournament. Things appear to be working with the 2020/2021 EPL season and I am updating some styling and background coding in the meantime.
+++
title: Update on Euro 2020 website
date: 2019/11/05
published: true
tags: javascript, vuejs, pusher, laravel, excel
+++
In order to get ahead of the rush for next year's UEFA Championships I have put in some upfront work on the website and worked through some of the todo items from the last pool. A short update on some of these items is below.
Automatic pool set up:
Adding in all of the matches, dates, teams, etc. was always a bit of a slog and a frustrating part of the pool. In the first version of the pool that I put online with a website in 2006, the predictions page had the input forms for each group and also a table of the group results which would update automatically as you entered predictions. With 48 matches on that page, each match built into a table with ~ 8 columns, plus the 8 group tables and the associated javascript to update everything, this was several thousand lines of code.
Updating the table in 2008 involved combing through that sheet and replacing the dates, home teams, away teams, etc. all manually.
This method was repeated until around 2014 or 2016, when I started to put the teams, match times, etc. in a series of arrays at the top of the file and then generated the tables with for loops. This greatly reduced the size of the file and also the work required to update the pool.
In 2016 this was switched to a custom wordpress add-in and the teams were added to the database with a rudimentary admin backend.
In 2018 the pool moved to a laravel site and the teams, matches, etc. were given dedicated tables which could be modified in the backend. This still involved manually entering the match information.
The updated site has a console command pool:setup which will automatically pull all of the match info from either the uefa or thescore website. There is a config file in which the source website is selected and the tournament id is given (for example, with uefa a competitionId of 3 is the European Cup and 1 is the Champions League; with thescore the league is epl for the English Premier League, etc.). The command line function takes in those configuration parameters, pulls all matches for the given tournament, and adds everything including team flags to the database.
This addition means that setting up future pools is much easier. As an example, there is a test site running right now that has the full 2019-2020 premier league season built in, and I have another test site with the 2019-2020 Champions League season.
Live updating of Scores:
This is something that I had been intending for some time but the actual approach was never clear to me. I have finally solved this issue with the process described below:
a. All matches are stored in the database in two tables: Matches and LiveMatches. Matches has scores of null until the match has been finished. All users have a Scores table which has the official scores, and a LiveScores which has the live / in-progress scores.
b. The match entries in the tables have a status column that can be set to SCHEDULED, UPCOMING, LIVE, or FINISHED. There is a cron job which runs every minute on the server and checks if any SCHEDULED or UPCOMING matches have a start time coming up. If there are no matches set to start within 45 minutes of now then nothing happens. If a match is scheduled to start within 45 minutes of now then it triggers the pool:update command.
c. pool:update starts by switching a match from UPCOMING to SCHEDULED 30 minutes prior to kick off. At this point the lineup is added to the database by querying the selected website and an event is sent to pusher.com to add this match automatically to the LIVE page of the website. More on this later.
d. While there are matches set to SCHEDULED pool:update continues to run every minute and the current match details are scraped from the internet. Once the match changes to LIVE then another event is sent to pusher.com and the score is set to 0-0 in the LiveMatch table.
e. While the match is LIVE, the score is queried every minute as well as all match events (substitutions, corners, goals, etc.) which are stored in a MatchEvents table. On the Live page of the website the events are grabbed from this table every minute. Whenever there is a goal scored, the score in the LiveMatch table is updated which triggers an event to update the LiveScores table which are displayed on the Live page of the website. This sends an event to pusher.com which tells the website to refresh with the latest LiveScores.
f. Once the match is finished, the match status is updated in the table to FINISHED and the final score is transferred from the LiveMatches table to the Matches table. At this point an event is triggered which updates all user scores.
The end result of all of this is that it will be possible to have the Live page of the website open during match play and track what the pool score will look like as the goals are scored.
Added websockets with pusher.com
I mentioned pusher.com a few times above. This is a service which provides websocket integration for other websites. It allows for realtime communication between the back end of the euro site and any users currently viewing the website.
With this integration, when you open the euro website you establish a direct connection to the euro pool channel on pusher.com. If something happens at the back end of the site a message can be sent to all open connections. As an example, when a match changes from UPCOMING to SCHEDULED a message is sent out to everyone who has the Live page on the website open. This message tells the page that a certain match is now being tracked and the javascript on the page knows to query the database to find that match and add the details for you to see. Similarly whenever a goal is scored this is sent immediately to all open pages.
This avoids the need to have a refresh timer built into the site which checks the score every few minutes and instead only gets the goal when one has been scored.
pusher.com allows for 100 maximum concurrent connections and 200,000 messages per day in the free account which should be adequate for this pool.
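On the browser side the subscription only takes a few lines with pusher-js. A rough illustration - the channel and event names here are made up, and the two helpers are hypothetical stand-ins for the site's axios calls:

```js
import Pusher from 'pusher-js';

const pusher = new Pusher('app-key', { cluster: 'eu' });
const channel = pusher.subscribe('euro-pool');

// A match has moved from UPCOMING to SCHEDULED: start showing it on the Live page
channel.bind('match-scheduled', ({ matchId }) => {
  fetchMatchDetails(matchId); // hypothetical helper: axios call for lineups etc.
});

// A goal has gone in: pull the latest LiveScores rather than polling on a timer
channel.bind('goal-scored', () => {
  refreshLiveScores(); // hypothetical helper
});
```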
As I mentioned above, this has been running without any input from me for the past 10 weeks and is automatically updating all matches in the EPL for this season - opening and closing predictions windows, allowing viewing of results after the predictions deadlines, updating all lineups, scores, match postponements, etc. There will be some additional testing required as the actual European Championships approach but for now I am pleased with the updates.

I have also overhauled the forum for the site since the one I was using before was no longer supported. This will require some additional tweaking but for now is a fully functional discussion forum.
The excel file which I use as a backup for the pool has also been fully overhauled. This is more of an exercise in Excel nerdiness, playing with lots of indirect cell references, conditional formatting, automatically hiding empty rows, etc. Essentially there are "input" sheets for each round into which the CSV data taken from the website is copied and pasted. These sheets are hidden from normal view. The only other required inputs are the scores of each match, which are added to the results page. Everything else is calculated and updated automatically without macros and displayed in a nice format on the summary page, and each entry is given a ranking based on total score. An additional ranking sheet gives the ultimate sorted rankings for the pool - which should match the website...
+++
title: Plows site - adding street by street details
date: 2019/01/21
published: true
tags: javascript, leafletjs, turfjs, openstreetmap, mysql, laravel
+++
The first version of the plow site I have been working on contained only a full map of St. John's and would show all plow pings captured during a given time period - up to 24 hours. The 24 hour time period was added because during a stormy period the number of points to be rendered can be very large and cause the browser to hang. At 24 hours it seemed to work reasonably well (though I did remove the mouseover tooltips which are present up to 12 hour durations).
The next thing I wanted to add was an option to query the database for any plow pings that were received in a given street during a selected time period. This required the following:
A list of streets in the city
This was obtained from the city of St. John's website which has, in some locations, a dropdown list of all streets - for example when looking up which ward you are in or when your garbage collection occurs. This list of streets was copied into a streetnames.js file as an array:
module.exports = ["18TH ST", "ABBI RD",
...
]
which can be imported into the vue component as
var streetNames = require('../streetNames.js')
This is then added to the component template using a v-for statement.
The actual geometry of the street.
This is obtained via a call to https://nominatim.openstreetmap.org/ which is the search engine for OpenStreetMap data. This is a limited availability public API but I assume my site will have such low traffic that it should be ok. Otherwise I will have to set up a database of streets.
The responses from the query give a list of points by latitude and longitude for the given street (if available in the openstreetmap database) which I can work with on the site backend and front end.
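As a rough illustration of that lookup (the query parameters follow Nominatim's public search API and the response handling is simplified, so treat this as a sketch rather than the site's code):

```js
import axios from 'axios';

async function streetGeometry(street) {
  const { data } = await axios.get('https://nominatim.openstreetmap.org/search', {
    params: {
      street,
      city: "St. John's",
      country: 'Canada',
      format: 'json',
      polygon_geojson: 1, // ask for the points making up the way, not just a single point
    },
  });
  // Each result carries lat/lon plus (when available) a geojson geometry for the street
  return data.length ? data[0].geojson : null;
}
```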
There are several issues with this method:
A. Some streets in St. John's are not currently in the openstreetmap database, it seems - for example 18th Street and Abbi Road, the two I showed above. Abbi Road actually doesn't seem to exist at all, though it is listed on the St. John's website.
B. The geometry of some streets is patchy. For a straight road the polyline can be generated by giving just start and end points, but if these points are far apart then you need a large search radius to catch all plow pings along that road, which can also spread the search to adjacent parallel streets.
I'm OK living with these flaws for now because the work required to get detailed nodes along each street in the city is not worth the reward.
Querying the database:
The challenge is to look through all of the points in the database and check if they are within a given polygon area that represents the road. Since the street definitions obtained in step 2 above are a series of points along the road, the easiest way to do this is to compute the distance between every point in the database and every point along the street, and grab any which have a distance less than a given search radius.
The distance between two points by their latitude and longitude can be calculated using the Haversine Formula and in my case I'm using:
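Written out as a small function, this is a sketch of the same spherical-law-of-cosines expression that appears in the SQL below:

```js
// Great-circle distance in km between two points given in radians
// (spherical law of cosines; 6365 is the Earth radius value discussed below)
const distanceKm = (lat1, lng1, lat2, lng2, R = 6365) =>
  R * Math.acos(
    Math.cos(lat1) * Math.cos(lat2) * Math.cos(lng2 - lng1) +
    Math.sin(lat1) * Math.sin(lat2)
  );
```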
6365 here is the assumed radius of the Earth (in km) at St. John's, based on a latitude of 52.7179 (source).
In order to achieve this in SQL the formula can be rewritten as:
SELECT id, type, latitude, longitude, created_at, (6371 * acos(cos(radians(street_point_latitude)) * cos(radians(latitude)) * cos(radians(longitude) - radians(street_point_longitude)) + sin(radians(street_point_latitude)) * sin(radians(latitude)))) as distance from plow_pings having distance < 0.03
Note that the above assumes a search radius of 0.03 km, and that the street point latitude and longitude have to be inserted into the query for each street point individually.
For a long street this was leading to many sql queries, so I built the final query in php using the Laravel query builder, adding one distance check per street point into a single query. $lat_lng_points is the array of street coordinates from Step 2 and a separate distance is calculated for each of them; the query then compares each item in the database against the $radius which is selected on the front end.
This step is carried out as an axios call to the backend and the resulting collection of plow pings is sent back to the vuejs component in the response.
Presenting the results on the map.
The response back from the database in Step 3 is taken by the vuejs component and three things are updated on the page:
A. Draw the plow pings on the map.
All plow pings are drawn as circleMarkers, which are available in the leafletjs library used to draw the map itself. This is a simple forEach loop over the array of pings returned, with a colour added based on the type of vehicle.
Each marker is added to an array with its id to allow updating of the marker in point C below.
(Note that the created_at timestamp has to be set to St. John's timezone as the database stores the timestamp in UTC (Coordinated Universal Time) which is the same as GMT)
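A sketch of that drawing loop (the colour values and ping fields are illustrative):

```js
import L from 'leaflet';

function drawPings(map, pings) {
  const markersById = {};
  pings.forEach((ping) => {
    const marker = L.circleMarker([ping.latitude, ping.longitude], {
      radius: 5,
      color: ping.type === 'plow' ? '#2b6cb0' : '#dd6b20', // colour by vehicle type
    }).addTo(map);
    // Keep a reference by id so the list in part C can highlight it on mouseover
    markersById[ping.id] = marker;
  });
  return markersById;
}
```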
B. Indicate the search region
The array of street points is converted to an array of circles with radius equal to the search radius chosen. This is done using the leaflet-geodesy library and then these circles are combined using the union function of the turfjs library.
The map is then zoomed and re-centered to fit the bounds of the search area.
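A self-contained sketch of building and displaying that search region - note it substitutes turf.circle for leaflet-geodesy so the example stands on its own, and the union call follows turf's v6-style two-argument signature:

```js
import L from 'leaflet';
import * as turf from '@turf/turf';

// streetPoints is the [lat, lng] array from Step 2, radiusKm the chosen search radius
function drawSearchRegion(map, streetPoints, radiusKm) {
  const circles = streetPoints.map(([lat, lng]) =>
    turf.circle([lng, lat], radiusKm, { steps: 32, units: 'kilometers' })
  );
  // Merge the overlapping circles into one polygon covering the street
  const merged = circles.reduce((acc, c) => turf.union(acc, c));
  const layer = L.geoJSON(merged).addTo(map);
  map.fitBounds(layer.getBounds()); // zoom and re-centre on the search area
  return layer;
}
```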
C. List the pings
This is a straightforward v-for loop to display all the pings (date and vehicle type) in a card on the right hand side of the page. Because each marker was assigned to an array by its id, the list of pings has a mouseover / mouseout action to identify the selected ping on the map by changing the size of the displayed circle.
The resulting output is shown in the image below for Atlantic Ave on the 15th of January 2019. The highlighted circle shows a plow passing at 2:26pm. This plow is shown in the video below with the same timestamp.

Backend: A public gist on Github accessed through the Github API and parsed into a Vuex store. All files are stored in the gist as either name.page.md or name.post.md for pages and posts respectively. Meta tags such as title, date, and description are added at the top of each file and parsed out when the gist is loaded into the site.
Frontend: A Vuejs 2.5 single page app with the following additions:
Vuex to store the information grabbed from the gist and make it available to each vue component,
axios to make the call to the Github API and fetch the data in the gist.
vue-meta to parse out the information at the top of each page into the appropriate meta fields and dynamically update things like the page title.
showdownjs is a markdown parser/converter to convert *italics* into italics or **bold text** to bold text. I have also written some small extensions to allow me to insert inline youtube videos and tweets - so inserting @ tweet(https://twitter.com/8bitfootball/status/1008450650936659974) will be replaced with the tweet in parentheses (a sketch of such an extension follows this list).
moment provides easy date and time manipulation which I've incorporated into Vue filters to present the published date of a post in several different formats.
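As an example of the kind of extension involved, a showdown "lang" extension for the tweet shortcode could look roughly like this (the regex and placeholder markup are assumptions, not the blog's actual code):

```js
import showdown from 'showdown';

// A "lang" extension runs on the markdown before conversion
showdown.extension('tweets', () => [{
  type: 'lang',
  regex: /@ tweet\((https?:\/\/twitter\.com\/\S+\/status\/(\d+))\)/g,
  // Leave a placeholder div; twitter's widgets.js fills it in after the page renders
  replace: '<div class="tweet-embed" data-id="$2" data-url="$1"></div>',
}]);

const converter = new showdown.Converter({ extensions: ['tweets'] });
const html = converter.makeHtml(
  '@ tweet(https://twitter.com/8bitfootball/status/1008450650936659974)'
);
```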
Backend: Laravel 5.7 running a scheduled task to query and store the position of all active plows from the City of St. John's AVL site.
Frontend: A Vuejs 2.5 component with three parts:
Loading overlay to show that something is happening when trying to render the several thousand points
Options pane to choose from:
date, time, and duration of the period to be rendered
type of vehicles to display
type of map - points or heatmap
Map pane showing either satellite or topographic map of St. John's with the plows shown.

Javascript Packages used:
axios: Used for making calls from the vuejs frontend to the laravel backend to update the plow ping data used in the map.
This pool is much the same as the euro 2020 pool. Main differences are the backend api coming from thescore rather than uefa, and vuejs updated from 2.x to 3.x.
Some features
live updating of scores (~ 1-5 minute delay) with a "Live" page which will provide the state of the pool as it stands before the matches have been finalized.
automated updating of the official scores based on the same api service which will provide the live scores.
updated forum - minor tweaks to the Euro 2020 one. Still using Quill as a text editor with some modifications to include giphy, twitter and youtube embeds, etc.
European Championships 2020 Pool
link: coming in early 2020
The basic workings of this pool's page will be an extension of the successful wc2018 page built using Laravel and Vuejs. The intention is to improve upon the design with some additional items such as:
multi-tenancy to allow for more than one pool to be run concurrently on the same site. (not being implemented)
live updating of scores (~ 1-5 minute delay) with an "in progress scores" page which will provide the state of the pool as it stands before the matches have been finalized. (undergoing testing but appears to be working)
automated updating of the official scores based on the same api service which will provide the live scores. (undergoing testing but appears to be working)
updated forum - possibly. I may re-use the chatter forum used before since it is fit for purpose but I will look into some alternatives as well. (rebuilt the forum completely to move away from chatter which was not being updated anymore)
Behind these are database tables for 'matches' which set up the predictions pages by looping over the Group field and using team names and stadiums from the 'teams' and 'stadiums' tables.
The matches table is part of an attempt to automate much of the site generation. In the very early pools I ran, the predictions table php file would be several hundred to several thousand lines as each match was filled in manually by hand. This has been reduced to under 50 lines with foreach loops in the top level blade file and the subsequent vuejs component for predictions.
I have tried to perpetuate these efficiencies throughout the front end with increasing use of vuejs components where appropriate.
For the next pool (Euro 2020) I am considering including a multi-tenancy set up to the back end to allow for additional pools to be run concurrently on the same site.
I had also set up social logins (using facebook, twitter, and google) however I chose to remove these after all of the Cambridge Analytica news started surfacing in early 2018. After the tournament was completed I scrubbed the database to remove user emails and names with just the nicknames left behind for record keeping.
Some Packages Used:
devdojo/chatter: Simple discussion board added in for banter during the tournament. In addition to the base level settings I added in additional markdown parsing to display youtube/vimeo videos and tweets inline.
axios: With all prediction submissions being done asynchronously I incorporated axios to make api calls to the laravel backend.
moment: Primarily used to deal with timezones, with users signing up from Newfoundland to Australia and everywhere in between. MomentJS also gives some easy humanised formatting for the upcoming deadlines (ie. '3 hours and 5 minutes from now' instead of 11100 seconds) - see the snippet after this list.
bulma: The main css framework used instead of bootstrap
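For example, that humanised deadline comes out of moment in a single call:

```js
const moment = require('moment');

moment().add(11100, 'seconds').fromNow();          // "in 3 hours"
moment.duration(11100, 'seconds').humanize(true);  // "in 3 hours"
```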
In the first web based pool that I did in 2006 I made my own login system, and this had some security flaws, so for the 2008 - 2014 versions of the pools I used the open source forum SMF as the basic backend. This forum provided a secure registration and login system and had the added bonus of providing a forum for posting messages during the tournament. On top of the SMF back end I had shoehorned in my pool scripts - predictions submission tables, results viewer, standings page, and the admin backend pages which carry out the calculations.
This system had been working fine but I eventually grew tired of SMF. For the 2016 pool I set up a wordpress site and developed a plugin to incorporate the pool specific pages. This gave me quite a lot of additional freedom, as Wordpress allows full control over styling the frontend and accessing the backend through their api system. The forum was implemented by allowing users to make and comment on posts.
I generally liked the Wordpress system for this version of the pool but I started learning Laravel at around this time and ultimately switched the pool to a Laravel site after this tournament.
I quickly created this website for Janice to use at the gym where she works. They were doing a "Transformation Challenge" over the course of 8 weeks between 20/01/2020 and 15/03/2020.
Members at the gym needed a place to record daily updates on things like:
how well they felt that day,
water consumption,
hours of sleep,
whether or not they had done a workout that day,
daily weight, and
a free space to record any comments about the day.
On the other side of the site the Coaches had to be able to monitor the members assigned to them to see at a glance how many workouts they had completed, their weight change over the course of the challenge, and also whether the member had requested a one-on-one consultation. Additional to the overview dashboard, the coaches could view all daily diary entries submitted by their members and also comment on those entries.
This site was quickly created using Laravel 6 with a log viewer and the intervention/image package as the only composer packages included. The back end is a straightforward CRUD system to store diary entries, comments, etc.
I introduced an "invitation" system so that only members who were invited by a coach could register for the site. This involved a database table of invitations with a unique invitation token for each invite. The registration page requires a valid invitation token blocking all non-valid requests.
User profiles have the option of adding a profile image which was done using a combination of the intervention/image php package on the back end and the croppie javascript library on the front end to crop the uploaded image.
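The front-end half of that flow, sketched with croppie (the element ids, option values, and upload endpoint are illustrative only):

```js
import axios from 'axios';
import Croppie from 'croppie';

const fileInput = document.getElementById('avatar-file');
const saveButton = document.getElementById('avatar-save');

const croppie = new Croppie(document.getElementById('avatar-crop'), {
  viewport: { width: 200, height: 200, type: 'circle' },
  boundary: { width: 260, height: 260 },
});

// Load the chosen file into the cropper
fileInput.addEventListener('change', () => {
  croppie.bind({ url: URL.createObjectURL(fileInput.files[0]) });
});

// On save, send the cropped result to the Laravel backend, where intervention/image
// resizes and stores the final profile image
saveButton.addEventListener('click', async () => {
  const blob = await croppie.result({ type: 'blob', size: 'viewport' });
  const form = new FormData();
  form.append('avatar', blob, 'avatar.png');
  await axios.post('/profile/avatar', form);
});
```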
Most front end elements are a combination of laravel blade and vuejs components with interaction between components being carried out using a vuejs Event bus.
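The event bus itself is just a bare Vue instance shared between components; a minimal sketch with made-up event names:

```js
// event-bus.js - the classic Vue 2 event bus
import Vue from 'vue';
export const EventBus = new Vue();

// In one component, e.g. when a coach saves a comment on a diary entry:
//   EventBus.$emit('comment-added', { entryId: 42 });
// In another component, e.g. the member's diary view:
//   EventBus.$on('comment-added', ({ entryId }) => this.reloadEntry(entryId));
```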
+++
title: World Cup 2022 Pool website
date: 2022/11/16
published: true
tags: javascript, vuejs, pusher, laravel, excel
+++
The World Cup 2022 Pool site is up and running now at https://wc2022.andreasgeorghiou.com. There are not a lot of major updates since the Euro 2021 pool - the main change is that I wasn't sure whether UEFA would offer all of the matches in their API: so far they are only showing the first 48 matches, which only gets to the end of the Group Phase, and I need to have matches for the whole tournament. I have shifted the back end api for this pool to TheScore, which does have placeholders for those later round matches.
As with the European Pool the site has:
Automatic pool set up:
The site has a console command pool:setup which will automatically pull all of the match info from the thescore website. The command line function takes in configuration parameters like leagueId, dateFrom, dateTo, etc. and pulls all matches for the given tournament, adding everything including team flags to the database. I didn't like the flags from thescore so I have overwritten those manually; similarly, the stadium images were scraped from the main tournament website.
Live updating of Scores:
This was first introduced in the Euro 2020 pool with the process described below:
a. All matches are stored in the database in two tables: Matches and LiveMatches. Matches have scores of null until the match has been finished. All users have a Scores table which has the official scores, and a LiveScores which has the live / in-progress scores.
b. The match entries in the tables have a status column that can be set to SCHEDULED, UPCOMING, LIVE, or FINISHED. There is a cron job which runs every minute on the server and checks if any SCHEDULED or UPCOMING matches have a start time coming up. If there are no matches set to start within 45 minutes of now then nothing happens. If a match is scheduled to start within 45 minutes of now then it triggers the pool:update command.
c. pool:update starts by switching a match from UPCOMING to SCHEDULED 30 minutes prior to kick off. At this point the lineup is added to the database by querying the selected website and an event is sent to pusher.com to add this match automatically to the LIVE page of the website. More on this later.
d. While there are matches set to SCHEDULED pool:update continues to run every minute and the current match details are scraped from the internet. Once the match changes to LIVE then another event is sent to pusher.com and the score is set to 0-0 in the LiveMatch table.
e. While the match is LIVE, the score is queried every minute as well as all match events (substitutions, corners, goals, etc.) which are stored in a MatchEvents table. On the Live page of the website the events are grabbed from this table every minute. Whenever there is a goal scored, the score in the LiveMatch table is updated which triggers an event to update the LiveScores table which are displayed on the Live page of the website. This sends an event to pusher.com which tells the website to refresh with the latest LiveScores.
f. Once the match is finished, the match status is updated in the table to FINISHED and the final score is transferred from the LiveMatches table to the Matches table. At this point an event is triggered which updates all user scores.
The end result of all of this is that it will be possible to have the Live page of the website open during match play and track what the pool score will look like as the goals are scored.
Added websockets with pusher.com
I mentioned pusher.com a few times above. This is a website which provides websockets integration for other websites. This allows for realtime communication between the back end of the wc2022 site and any users currently viewing the website.
With this integration, when you open the wc2022 website you establish a direct connection to the wc2022 pool channel on pusher.com. If something happens at the back end of the site a message can be sent to all open connections. As an example, when a match changes from UPCOMING to SCHEDULED a message is sent out to everyone who has the Live page on the website open. This message tells the page that a certain match is now being tracked and the javascript on the page knows to query the database to find that match and add the details for you to see. Similarly whenever a goal is scored this is sent immediately to all open pages.
This avoids the need to have a refresh timer built into the site which checks the score every few minutes and instead only gets the goal when one has been scored.
pusher.com allows for 100 maximum concurrent connections and 200,000 messages per day in the free account which should be adequate for this pool.
This worked well during the European Championships pool and I am hopeful everything will run smoothly this time even with the different match update api source.
The discussion forum took some tweaking between July 2021 and Nov 2022. Mostly this was due to my changing from vuejs 2.0 to 3.0 which caused all sorts of breaks in my code that had to be fixed. I think things are working well at the moment.
As with all pools I have an offline excel file which I use as a backup for the pool. This is more of an exercise in Excel nerdiness, playing with lots of indirect cell references, conditional formatting, automatically hiding empty rows, etc. Essentially there are "input" sheets for each round into which the CSV data taken from the website is copied and pasted. These sheets are hidden from normal view. The only other required inputs are the scores of each match, which are added to the results page. Everything else is calculated and updated automatically without macros and displayed in a nice format on the summary page, and each entry is given a ranking based on total score. An additional ranking sheet gives the ultimate sorted rankings for the pool - which should match the website if my algorithms work as expected.