Monday, April 27, 2015

How to handle concurrent requests that delete and create the same rows?

I have a table that looks like the following:

game_stats table:

id | game_id    | player_id | stats       | (many other cols...)
---+------------+-----------+-------------+---------------------
1  | 'game_abc' | 8         | 'R R A B S' | ...
2  | 'game_abc' | 9         | 'S B A S'   | ...

A user uploads data for a given game in bulk, submitting both players' data at once. For example:

"game": {
  id: 'game_abc',
  player_stats: {
    8: {
      stats: 'R R A B S'
    },
    9: {
      stats: 'S B A S'
    }
  }
}

Submitting this to my server should produce the table shown above.

Instead of updating the existing rows when the same data is submitted again (with revisions, for example), what I do in my controller is first delete all existing rows in the game_stats table that have the given game_id, then insert the submitted data fresh:

class GameStatController < ApplicationController
  def update
    game_id = params[:game][:id]
    # Wipe whatever is already stored for this game...
    GameStat.where("game_id = ?", game_id).destroy_all
    # ...then re-insert one row per player from the submitted payload
    params[:game][:player_stats].each do |player_id, stats|
      GameStat.create!(game_id: game_id,
                       player_id: player_id,
                       stats: stats[:stats])
    end
  end
end

This works fine with a single-threaded or single-process server. The problem is that I'm running Unicorn, which is a multi-process server. If two requests for the same game come in at the same time, I get a race condition:

Request 1: GameStat.where(...).destroy_all
Request 2: GameStat.where(...).destroy_all
Request 1: Save new game_stats
Request 2: Save new game_stats

Result: Multiple game_stat rows with the same data.

I believe locking the rows or the table is the way to prevent multiple updates from running at the same time, but I can't figure out how to do it. Combining the lock with a transaction seems like the right thing to do, but I don't really understand why.
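
To make the question concrete, here is the sort of thing I imagine, as a rough sketch only: it assumes PostgreSQL and uses a transaction-scoped advisory lock (pg_advisory_xact_lock) keyed per game. The Zlib.crc32 call is just my assumed way of turning the string game_id into the integer key that function requires (unlike Ruby's String#hash, it is stable across processes):

require 'zlib'

class GameStatController < ApplicationController
  def update
    game_id = params[:game][:id]
    GameStat.transaction do
      # Blocks until no other transaction holds the same key; the lock is
      # released automatically on commit or rollback, so a crashed request
      # can't leave the game locked forever.
      lock_key = Zlib.crc32(game_id.to_s)
      GameStat.connection.execute("SELECT pg_advisory_xact_lock(#{lock_key})")

      GameStat.where(game_id: game_id).destroy_all
      params[:game][:player_stats].each do |player_id, stats|
        GameStat.create!(game_id: game_id,
                         player_id: player_id,
                         stats: stats[:stats])
      end
    end
  end
end

With this, two concurrent requests for the same game would run their delete-and-recreate steps one after the other instead of interleaving. But I'm not sure this is the right approach, or whether it's portable beyond PostgreSQL.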

EDIT

To clarify why I can't figure out how to use locking: I can't lock a single row at a time, since the rows are deleted outright rather than modified, so there is no surviving row to hold a lock on.
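
One workaround I can think of (again only a sketch, and it assumes a parent games table with one stable row per game, which my schema may or may not have): lock that parent row instead of the game_stats rows, since it survives the delete-and-recreate:

class GameStatController < ApplicationController
  def update
    game = Game.find(params[:game][:id])
    # with_lock opens a transaction and reloads the row with
    # SELECT ... FOR UPDATE, so a second request for the same game
    # blocks here until the first one commits.
    game.with_lock do
      GameStat.where(game_id: game.id).destroy_all
      params[:game][:player_stats].each do |player_id, stats|
        GameStat.create!(game_id: game.id,
                         player_id: player_id,
                         stats: stats[:stats])
      end
    end
  end
end

Is something like this the idiomatic way to serialize these requests, or is there a better pattern?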
