Sonar is an open source web application for analyzing and measuring software code quality. It offers visual reporting of metrics and aggregates historical information across numerous data collection points. Analysis of the Java language is provided by default; support for other languages is available through commercial and open source plugins.
This article describes the process for installing and integrating Sonar with a PostgreSQL database.
Configuration has been performed using Sonar 2.13.1 in standalone mode (embedded Jetty servlet container) and PostgreSQL on Windows.
Update – 9 November 2012
The installation steps described can also be followed for installing Sonar 3 (note: there may be a difference in the number of database tables created).
Create Sonar database user
Create the sonar user:
CREATE USER sonar WITH PASSWORD 's0me1nsecurepassw0rd';
postgres=# SELECT usename, usesysid FROM pg_user;
 usename  | usesysid
----------+----------
 postgres |       10
 sonar    |    16393
(2 rows)
Create Sonar database
Sonar requires a database with UTF8 character set. Create the database as follows:
CREATE DATABASE sonar WITH OWNER sonar ENCODING 'UTF8';
postgres=# SELECT datname, datdba, encoding FROM pg_database;
  datname  | datdba | encoding
-----------+--------+----------
 template1 |     10 |        6
 template0 |     10 |        6
 postgres  |     10 |        6
 sonar     |  16393 |        6
(4 rows)
Configure JDBC for Sonar
Edit conf/sonar.properties, altering the JDBC properties to point at the newly created database. The tables will be created automatically when Sonar starts.
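As a sketch, the relevant JDBC entries in conf/sonar.properties would look similar to the following (property names as shipped in the Sonar 2.x distribution; adjust the host, port and password to suit your environment):

```
# PostgreSQL connection for Sonar
sonar.jdbc.username:                       sonar
sonar.jdbc.password:                       s0me1nsecurepassw0rd
sonar.jdbc.url:                            jdbc:postgresql://localhost/sonar
sonar.jdbc.driverClassName:                org.postgresql.Driver
```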
Start Sonar in standalone mode.
Validate Sonar tables
Log into the sonar database:
psql -d sonar -h localhost -U sonar
Validate that 47 sonar tables exist:
sonar=> SELECT table_name FROM information_schema.tables WHERE table_schema = 'public';
        table_name
---------------------------
 schema_migrations
 projects
 metrics
 rules_categories
 rules_parameters
 project_links
 rules
 rule_failures
 project_measures
 users
 active_rule_parameters
 snapshot_sources
 active_rules
 alerts
 rules_profiles
 events
 properties
 groups
 groups_users
 group_roles
 user_roles
 dependencies
 measure_data
 filters
 filter_columns
 criteria
 active_filters
 quality_models
 characteristic_edges
 characteristics
 characteristic_properties
 active_dashboards
 dashboards
 widgets
 widget_properties
 review_comments
 reviews
 active_rule_param_changes
 active_rule_changes
 manual_measures
 notifications
 duplications_index
 snapshots
 action_plans
 loaded_templates
 resource_index
 action_plans_reviews
(47 rows)
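Rather than eyeballing the full list, a simple count query (an illustrative check, not part of the Sonar tooling) confirms the expected total:

```
SELECT count(*) FROM information_schema.tables
WHERE table_schema = 'public';   -- expect 47 for Sonar 2.13.1
```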
Browse to Sonar
Access the Sonar application on the default port at http://localhost:9000. You will see an empty project list, ready for population with metrics generated from your projects.
Scenarios such as testing or baseline deployments require populating database tables with seed data. PostgreSQL offers a number of ways to load data, including SQL scripts and database dumps; however, a simple alternative is to load from a CSV file. This approach is particularly useful where non-technical people are involved in preparing functional test data.
For the purpose of this trivial example, a table containing bids registered by traders is being populated. This table has no foreign key relationships. Create the bid table with the following DDL:
CREATE TABLE bid
(
  bid_id bigint NOT NULL,
  trader_id character varying(10) NOT NULL,
  bid numeric,
  CONSTRAINT bid_pkey PRIMARY KEY (bid_id)
)
WITH (
  OIDS=FALSE
);
ALTER TABLE bid OWNER TO postgres;
CSV Data File
Create a CSV file containing the bid data with the first row containing column headers.
ID,Trader,Bid
1,AA14124,0.54615104
2,AA14125,0.594037188
3,AA14126,0.884697673
4,AA14127,0.55843941
5,AA14128,0.90518401
6,AA14129,0.977280122
7,AA14130,0.277475746
8,AA14131,0.559351948
9,AA14132,0.261013363
10,AA14133,0.236723671
etc...
Process CSV Data File
Assuming bid-data.csv exists in the /Temp directory, load the CSV into PostgreSQL by executing the following command in the pgAdmin query tool:
COPY bid FROM '/Temp/bid-data.csv' DELIMITER ',' CSV HEADER;
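Note that COPY ... FROM runs on the database server, so the file path must be readable by the server process (and the command typically requires superuser privileges). If the CSV lives on a client machine instead, psql's \copy meta-command reads the file client-side; a sketch of the equivalent invocation (same table and path as above):

```
\copy bid FROM '/Temp/bid-data.csv' CSV HEADER
```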
If the load is successful you will see a message similar to the following displaying the number of rows processed:
Query returned successfully: 2500 rows affected, 182 ms execution time.
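As a final sanity check (illustrative queries, not part of the load itself), verify the row count and spot-check a few records:

```
SELECT count(*) FROM bid;                   -- should match the rows-affected count
SELECT * FROM bid ORDER BY bid_id LIMIT 5;  -- eyeball the first few bids
```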