Company API

The JSON layout for a company is as follows:
```json
{
  "name": <string>,
  "address": <string>,
  "city": <string>,
  "country": <string>,
  "phone": <string>,
  "cvr": <integer>
}
```
| Path | Description |
|---|---|
| `GET /api/companies` | Returns a list of all companies in the database, each containing id, name and cvr. |
| `POST /api/companies` | Adds a new company to the database. The company CVR is checked to make sure it contains 8 digits; no modulo 11 check is performed. |
| `GET /api/companies/:id` | Returns a JSON object of the company with id `:id`, or 404 if it does not exist. |
| `DELETE /api/companies/:id` | Deletes the company with id `:id` from the database. |
The following command creates a folder companyindex that contains a barebones Padrino project.
```shell
padrino g project companyindex \
  -d activerecord \
  -a postgres \
  -s jquery \
  -e haml \
  -c sass \
  -t minitest \
  -i
```
More information, including descriptions of the flags, can be found here.
In the companyindex folder, the command `bundle install --path vendor/bundle` installs the gems listed in the Gemfile.
Create the database with `bundle exec padrino rake ar:create`. This requires that your PostgreSQL server is set up appropriately.
Reference here.
The following command generates the Company model:

```shell
bundle exec padrino g model \
  Company \
  name:string \
  address:string \
  city:string \
  country:string \
  phone:string \
  cvr:integer
```
Database migration bug
Running `bundle exec padrino rake ar:migrate` fails with the error:

```
Directly inheriting from ActiveRecord::Migration is not supported.
Please specify the Rails release the migration was written for:

class CreateCompanies < ActiveRecord::Migration[4.2]
```
I am unsure why this hasn't been fixed, but the solution is to modify db/migrate/001_create_companies.rb and, as the error suggests, specify the version of ActiveRecord used.
In 001_create_companies.rb, change `class CreateCompanies < ActiveRecord::Migration` to `class CreateCompanies < ActiveRecord::Migration[5.1]`.
Of course, `[5.1]` needs to correspond to your version of ActiveRecord.
To add a unique index to the CVR column, change db/migrate/001_create_companies.rb to look something like the following (the columns match those given to the model generator above).

```ruby
class CreateCompanies < ActiveRecord::Migration[5.1]
  def self.up
    create_table :companies do |t|
      t.string :name
      t.string :address
      t.string :city
      t.string :country
      t.string :phone
      t.integer :cvr
      t.timestamps null: false
    end
    # This line creates a unique index for the CVR number.
    add_index "companies", "cvr", unique: true
  end

  def self.down
    drop_table :companies
  end
end
```
Generate the API app with `bundle exec padrino g app api`.
Add some skeleton actions to the API with the following command:

```shell
bundle exec padrino g controller companies \
  get:index delete:companies post:companies \
  --app api
```
The controller is now in api/controllers/companies.rb; this is where the API endpoints are implemented. Validation for the API is done in models/company.rb.
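The rules the model validation enforces (presence, 8-digit format, uniqueness) boil down to something like the following plain-Ruby sketch. The class and method names here are my own, illustrative choices, not the actual ActiveRecord code in models/company.rb:

```ruby
# Illustrative stand-in for the checks models/company.rb performs:
# the CVR must be exactly 8 digits and unique among known companies.
class CvrValidator
  def initialize(existing_cvrs = [])
    @existing = existing_cvrs.map(&:to_s)
  end

  def valid?(cvr)
    s = cvr.to_s
    s.match?(/\A\d{8}\z/) && !@existing.include?(s)
  end
end

validator = CvrValidator.new(["11111111"])
validator.valid?(12345678)  # => true
validator.valid?("1234")    # => false (not 8 digits)
validator.valid?(11111111)  # => false (already taken)
```

In the real model, the uniqueness part is enforced both by a validation and by the unique database index added in the migration.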
The application uses Sprockets in the base app, which means assets go into app/assets. The public folder only serves static files.
The HTML is set up with templates, beginning with app/views/layouts/app.haml.
The rest are templates in the app/views/ folder. The hierarchy is as follows.
For minification I tried yui-compressor and uglifier; however, the first required Java on the dyno, while the second could not minify due to an error.

A CVR value is only considered valid if it contains 8 digits; no modulo 11 check is performed, though it would be easy to implement.

The usual approach would be that each "customer" is issued a unique token, which lets them perform API calls.
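For reference, the modulo 11 check could look like the sketch below. I am assuming the standard Danish scheme, where the eight digits are weighted 2, 7, 6, 5, 4, 3, 2, 1 and the weighted sum must be divisible by 11; double-check this against the official CVR documentation before relying on it:

```ruby
# Weights for the assumed Danish modulo 11 scheme.
CVR_WEIGHTS = [2, 7, 6, 5, 4, 3, 2, 1]

# Returns true if the CVR has 8 digits and passes the modulo 11 check.
def valid_cvr?(cvr)
  s = cvr.to_s
  return false unless s.match?(/\A\d{8}\z/)
  sum = s.chars.map(&:to_i).zip(CVR_WEIGHTS).sum { |d, w| d * w }
  (sum % 11).zero?
end

valid_cvr?("12345674")  # weighted sum is 110, which is divisible by 11
```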
One could also generate a certificate/private key pair and use client certificates to verify the identity of the caller. It would most likely be better if the clients generated these themselves.
That depends on the level of redundancy and update policy.
My main idea would be to create a second table containing versioned changes. The table would have createdAt and replacedAt columns plus a CVR foreign key, describing when a version was first put in use and when it was replaced. I would most likely store the data itself in JSON/XML format, since this would handle future updates to the main table without having to change the versioning.
I imagine versioning would only be interesting historically. If many queries that do not filter on CVR are expected, I would choose to replace the JSON with the same columns as the main table.
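As an in-memory sketch of the idea (the names `record_change`, `created_at` and `replaced_at` are my own, hypothetical choices), each change closes the previous open version for that CVR and appends a new JSON snapshot:

```ruby
require 'json'
require 'time'

# Each entry mirrors a row in the hypothetical versioning table:
# a CVR foreign key, created_at/replaced_at timestamps, and a JSON blob.
def record_change(versions, cvr, data, now = Time.now)
  open = versions.find { |v| v[:cvr] == cvr && v[:replaced_at].nil? }
  open[:replaced_at] = now if open  # close the superseded version
  versions << { cvr: cvr, created_at: now, replaced_at: nil,
                data: JSON.generate(data) }
end

versions = []
record_change(versions, 12345678, { "city" => "Aarhus" })
record_change(versions, 12345678, { "city" => "Copenhagen" })
# The first entry now has replaced_at set; the last is still open.
```

Because the row data is an opaque JSON blob, adding a column to the main table requires no change to this versioning scheme.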
These days I would likely look at Elasticsearch to see what it offers.
If I were to implement it myself, the approach would depend on the type of search required. For massive amounts of data, simple queries could be handled by building a suffix tree over the search keys; depending on the expected query volume, plain database queries may also be sufficient.
For Hamming-distance searches, the only way would be a brute-force pass over all entries.
There may be some tricks for regular-expression search, but I am not sure.
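The brute-force Hamming-distance search is just a linear scan comparing the query against every equal-length key; a minimal sketch:

```ruby
# Number of positions at which two equal-length strings differ.
def hamming(a, b)
  raise ArgumentError, "lengths differ" unless a.length == b.length
  a.chars.zip(b.chars).count { |x, y| x != y }
end

# Brute force: compare the query against every entry and keep the close ones.
def within_distance(entries, query, max_dist)
  entries.select { |e| hamming(e, query) <= max_dist }
end

keys = %w[12345678 12345670 87654321]
within_distance(keys, "12345678", 1)  # => ["12345678", "12345670"]
```

This is O(n·m) for n entries of length m, which is exactly why it only pays off when no better index structure is available.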