Commit 94103ebe authored by Jonathan Hargreaves's avatar Jonathan Hargreaves

Merge branch 'master' of git.astron.nl:desp/hdl

Needed before changes can be pushed
parents 9c5728f8 c2393629
* GIT workflow
* Confluence
* Polarion
* Latex
*******************************************************************************
* GIT workflow
*******************************************************************************
difftool ?
mergetool ?
* Pro Git book by Scott Chacon: https://git-scm.com/book/en/v2
* YouTube : David Mahler part 1,2,3
Part 1:
# After GIT install
git version
git config --global user.name "EricKooistra"
git config --global user.email "erkooi@gmail.com"
git config --list
touch .gitignore # create .gitignore if it does not already exist
.gitignore # file with working tree dirs and files to ignore, must also be committed
# To start a repo
cd ~/git
git init # start new repo at this dir, creates .git/
git clone # get and start with existing repo
git clone git@git.astron.nl:desp/args.git
git status # what is in stage area and what is modified
Three areas:
* working tree # local directory tree
| git add
v
* staging area (index)
|
v git commit
* history # .git repository with entire commit graph
# To use a repo
git add <dir>/<file> # add to stage area, set for commit. Cannot add empty dir, need empty file in it
git add . # add all new and modified files to the staging area
git diff # diff between file in working tree and staging area
git diff --staged # diff between file in staging area and history
git rm <filename> # remove file from working tree and stage the delete
git checkout -- <filename> # revert a working tree change
git reset # clear stage area
git reset HEAD <filename> # revert staged change
git log -- <filename> # show history of file
git checkout <version hash> -- s2 # retrieve file from history into staged area and working tree
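The Part 1 commands can be tried end-to-end in a scratch repository. A minimal sketch (the file name s2 and the identity values are placeholders):

```shell
# walk a file through the three areas in a throw-away repo
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.name "Example"            # placeholder identity for the scratch repo
git config user.email "example@example.com"
echo "hello" > s2                         # change in the working tree
git add s2                                # working tree -> staging area
git commit -q -m "add s2"                 # staging area -> history
echo "world" >> s2                        # new working tree change
git diff --name-only                      # working tree vs staging area: lists s2
git checkout -- s2                        # revert the working tree change
git log --oneline -- s2                   # history of the file: one commit
```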
Part 2:
git commit -m "" # commit what is in stage area
git commit -a -m "" # add to stage area and commit what is in stage area
alias graph="git log --all --decorate --oneline --graph"
git branch <branch name> # create branch
git branch # show branches
git checkout <branch name> # change working tree and stage area to branch
git checkout master
git merge <branch name> # Fast-forward merge of <branch name> into master if there is a direct
                        # path, by moving master to <branch name>; this happens when there have
                        # been no updates on master since the branch was created.
                        # A three-way merge combines the differences of the branch and master
                        # relative to their common ancestor; this can lead to merge conflicts if
                        # both branches change the same parts of a file.
git branch --merged # show branches that have been merged to master
git branch -d <branch name> # remove branch
git checkout <commit hash> # detached HEAD because it points to a version not a branch
git branch <branch name> # start a branch from the commit hash, HEAD is attached again
# Stash area to store working tree
git stash save "comment" # store working tree and stage area to get a clean working tree
git stash list # show all stashes
git stash apply <label> # restore stash
git stash apply # restore last stash
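The fast-forward case described in the merge notes above can be reproduced in a scratch repo; the branch name feature and file name f are made up:

```shell
# fast-forward merge: no commits on the base branch since branching
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
git config user.name "Example"
git config user.email "example@example.com"
echo a > f
git add f
git commit -q -m "base"
base=$(git symbolic-ref --short HEAD)     # master or main, depending on git config
git checkout -q -b feature
echo b >> f
git commit -q -a -m "work on feature"
git checkout -q "$base"
git merge feature                         # fast-forward: base branch moves to feature
git branch --merged                       # lists feature
git branch -d feature                     # safe to delete a merged branch
```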
Part 3: Remote repositories (Github, Gitlab, Bitbucket, ...)
create repo on Github
README.md # md = markdown
git clone <url:.../<repo name>.git> # get copy from url
cd <repo name>
git config --local user.name "EricKooistra"
git config --local user.email "erkooi@gmail.com"
git remote # origin
git remote -v # full url
# To align with remote repo
# update from remote
git status # shows also origin/master, but not live
git fetch origin
git status # shows also origin/master, now with latest remote
git merge origin/master
git pull # get latest from remote repo, combines fetch and merge
# upload to remote
git push
git push origin master # put local repo to remote repo
# On Github a fork is a copy of a repo on Github, to get a repo on your own account
git clone <url of fork> # get copy of fork repo, will be origin
git remote add upstream <url of original repo on Github> # will be upstream
git fetch upstream
git status
# commit local change on branch
# git push origin <branch name> # push to my fork repo on Github
# pull request on Github
# delete branch and fetch upstream if the pull request was accepted
git remote remove <remote name> # remove a remote repo
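The fork flow above can be rehearsed entirely locally by letting a second local repository stand in for the original repo on Github; all names and paths below are placeholders:

```shell
# simulate the Github fork workflow with local repositories
tmp=$(mktemp -d)
cd "$tmp"
git init -q seed
cd seed
git config user.name "Example"
git config user.email "example@example.com"
echo v1 > file
git add file
git commit -q -m "v1"
git clone -q --bare . ../upstream.git     # stands in for the original repo on Github
cd ..
git clone -q upstream.git work            # the "fork" clone; origin points at upstream.git
cd work
git remote add upstream ../upstream.git   # in real life: url of the original repo
git fetch -q upstream
git remote -v                             # shows origin and upstream
```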
*******************************************************************************
* Confluence:
*******************************************************************************
- Use the 'space tools' menu at the lower left to order sections.
*******************************************************************************
* Polarion:
*******************************************************************************
*******************************************************************************
* LaTeX
*******************************************************************************
- \sigma \sqrt{}
- 4.15 \cdot 10^{15}
- M =
  \left[ \begin{array}{cccc}
  1 & 2 & 3 & 4\\
  5 & 6 & 7 & 8\\
  \end{array} \right]
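Wrapped in a minimal document these snippets compile standalone (a sketch; the N under the square root is a placeholder):

```latex
\documentclass{article}
\begin{document}
Noise scales as $\sigma \sqrt{N}$, a large value is $4.15 \cdot 10^{15}$, and
\[
  M = \left[ \begin{array}{cccc}
    1 & 2 & 3 & 4\\
    5 & 6 & 7 & 8\\
  \end{array} \right]
\]
\end{document}
```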
*******************************************************************************
* 1) Article topics
*******************************************************************************
Title: RadioHDL build environment for FPGA firmware
*******************************************************************************
* 2) User guide topics
*******************************************************************************
a) Introduction:
RadioHDL is a highly flexible automated build environment for HDL source code. The HDL is
organized in HDL libraries. The HDL libraries promote code reuse. The top level component
that can run on an FPGA is also a HDL library. The parameters for HDL libraries, the build
tools and target FPGA are kept in configuration files. The configuration files and source
code are the inputs for the RadioHDL tool. The output is a build result that depends on
which build tool is used. The build result can e.g. be a project file for Modelsim to
simulate the HDL, a project for Quartus to synthesize the HDL, or a report log from a
regression test that simulated the HDL.
Technology dependent IP for e.g. PLL, RAM, FIFO, transceivers, DDR4 is included via HDL
libraries as well. The technology dependent IP is pregenerated and instantiated using wrapper
HDL. The wrapper HDL around the technology dependent IP makes the IP vendor agnostic. The
wrapper HDL can select one or more vendor IP. The wrapper HDL can also select a behavioral
simulation model of the IP, to speed up the HDL simulation or to simulate without any vendor
dependence.
Features:
- Gear scripting based on Python 3.x
- Configuration files to define the sources and how to build them
- Separation of source files and build result files
- All HDL organised in HDL libraries that
. provide hierarchical structure and promote reuse
. allow separation of technology dependent libraries, board specific libraries, general
libraries and application specific libraries
. can be used to build different revisions of the same source code based on generics
. can include local test benches to verify the library in a regression test
b) Quick start:
c) Config files:
hdllib.cfg :
. Each HDL Library has a local hdllib.cfg configuration file that defines the sources and
supported tools
. there are many: one local per HDL library, in a sub directory of $RADIOHDL_WORK
hdl_buildset_<buildset_name>.cfg
. A central hdl_buildset_<name> build configuration file defines the combination of sources,
  FPGA type and version, and tool versions that are needed to build for a target FPGA (board),
  including the type and version of the tools for synthesis and simulation.
. defines a combination of board, FPGA and tool versions
. one central per buildset located at $RADIOHDL_CONFIG
hdl_tool_<tool_name>.cfg
. A central hdl_tool_<name>.cfg tool configuration file defines central settings for that
  tool. Typical tools are e.g. Modelsim for simulation and Quartus for synthesis, but other
  tool vendors can also be supported, as well as other tools like a regression test that
  runs a set of test benches for a set of HDL libraries.
. defines tool specific settings (e.g. modelsim, quartus)
. one central per tool located at $RADIOHDL_CONFIG
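As an illustration only, an hdllib.cfg for a small library could look roughly like the sketch below; the key names are indicative rather than authoritative, so check an existing RadioHDL library for the actual set:

```
hdl_lib_name = my_lib
hdl_library_clause_name = my_lib_lib
hdl_lib_uses_synth = common
hdl_lib_technology =

synth_files =
    src/vhdl/my_lib_pkg.vhd
    src/vhdl/my_lib.vhd

test_bench_files =
    tb/vhdl/tb_my_lib.vhd
```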
d) Environment setup
* Operating system
RadioHDL supports both Windows and Linux operating systems. The following tools need to be
available in order to build target files (TBC):
- Make (available in /bin for win32 platforms)
- Python 3.x
- Python libraries (numpy, pylatex, yaml)
* Environment variables
.bashrc
- ALTERA_DIR is set to where Altera tool version is installed (i.e. Quartus)
- MENTOR_DIR is set to where Mentor tool version is installed (i.e. Modelsim)
- MODELSIM_ALTERA_LIBS_DIR is set to where the compiled Altera tool versions HDL libraries
for simulation with Modelsim are stored
init_<my_project>.sh
- RADIOHDL_WORK is set to location of this init_<my_project>.sh and defines the root
directory from where all HDL libraries source files (hdllib.cfg) can be
found
- RADIOHDL_BUILD_DIR is set to ${RADIOHDL_WORK}/build and defines where all build results will
be put
- HDL_IOFILE_SIM_DIR is set to ${RADIOHDL_BUILD_DIR}/sim and defines where Modelsim simulation
  will keep temporary IO files.
init_radiohdl.sh
- RADIOHDL_GEAR is set to location of this init_radiohdl.sh and defines the root directory
of where the RadioHDL tool is installed.
- RADIOHDL_CONFIG is set to ${RADIOHDL_GEAR}/config if not already defined by the user and
defines where the central configuration scripts for buildsets (unb1, unb2b)
and tools (modelsim, quartus) are kept.
Environment variables for build tools (e.g. modelsim, quartus) are set automatically (TBC)
MODEL_TECH_DIR
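A minimal sketch of what an init_<my_project>.sh could contain, using only the variables listed above (the mkdir is an assumption for convenience, not part of the documented behaviour):

```shell
# derive the RadioHDL work variables from this script's own location
RADIOHDL_WORK="$(cd "$(dirname "${BASH_SOURCE:-$0}")" && pwd)"
export RADIOHDL_WORK
export RADIOHDL_BUILD_DIR="${RADIOHDL_WORK}/build"
export HDL_IOFILE_SIM_DIR="${RADIOHDL_BUILD_DIR}/sim"
mkdir -p "${HDL_IOFILE_SIM_DIR}"          # ensure the simulation IO directory exists
```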
* Environment files
Altera hdl_user_components.ipx:
This hdl_user_components.ipx defines where Quartus QSYS searches for user components. The
init_<my_project>.sh copies hdl_user_components.ipx from $RADIOHDL_WORK to
${ALTERA_DIR}/ip/altera/user_components.ipx.
f) Directory structure
RadioHDL requires that:
- the HDL source code is organised in one or more HDL libraries
- each HDL library has a dedicated directory and hdllib.cfg file
- all HDL library directories can be found from a common root directory
Hence a single VHDL file in a single HDL library is enough to get started with RadioHDL. For a
larger firmware design that contains all source code and configuration settings to build an
application that can be synthesized and run on an FPGA, the directory structure can be:
- applications/ -- application specific top level HDL libraries and application specific support
HDL libraries
- boards/ -- board specific HDL top level HDL libraries and board specific support HDL
libraries
- libraries/ -- general HDL libraries
base/ -- common low level functions (FIFO, block RAM, register, multiplexer,
reordering, packetizing, ..)
dsp/ -- digital signal processing (FFT, FIR filter, filterbank, beamformer, ...)
io/ -- input/output (DDR, transceivers, Ethernet, flash, ADC, I2C, ...)
external/ -- RTL style HDL code from other parties
technology/ -- Technology wrapper libraries of vendor IP, vendor IP libraries, behavioral
model IP libraries (clock, memory, IO)
For example the board support package (BSP) firmware of an FPGA on a certain board can be placed
in the boards/<board name>/libraries/bsp/ directory.
Each board is assumed to have a single board support package library that includes all code
required to use the FPGA on the board; this includes e.g. a 1GbE interface for control and a
flash interface for storing the FPGA image.
This BSP can be instantiated into a minimal firmware image that can run on an FPGA and
can be placed in the boards/<board name>/designs/<board_name>_minimal/ directory.
A different version of a board will result in a different board_name, especially if the new
version of the board contains another FPGA type or version or requires a new tool version.
A more elaborate firmware application that runs on the board will instantiate the BSP and a
combination of application specific code and general components. The application specific
components can be grouped in applications/<application_name>/libraries and the application itself
in applications/<application_name>/designs/<design_name>. If the top level entity of the application
has generics then it is possible to build different revisions of the same design source code in
applications/<application_name>/designs/<design_name>/revisions. The boards and applications build
upon general HDL libraries. The directories may be structured with sub directories, with at the
lowest level the sub directory of an HDL library, which typically uses that library name <lib_name>
or a derivative of it as directory name. The HDL library directories are typically organized by
purpose, source (src) or test bench (tb), and by HDL language (vhdl, verilog). However these sub
directories are only for convenience, because the hdllib.cfg can refer to the code from anywhere
within <lib_name>. There is also a doc directory to put any documentation that is relevant for the
library.
<lib_name>/hdllib.cfg
/src/vhdl
/tb/vhdl
/doc
When deciding on how to divide HDL code into libraries the following general rules should be
followed:
- HDL Libraries within the /libraries directories should be single purpose and not associated with
a particular hardware platform.
- Hardware specific functions should be placed into the /boards/<board>/libraries directory.
- Small modules that will be frequently reused in other libraries or generic wrappers for vendor
specific IP should be placed in /libraries/base/<hdl library>. Higher level libraries should be
placed in /libraries/io/<hdl library> or /libraries/dsp/<hdl library> depending on functionality.
- Externally supported code should be placed in /libraries/external/<hdl library>
- Vendor IP should be placed into /libraries/technology/<hdl library> and grouped according to
  function area, e.g. memory, fifo, mac_10g.
Technology independence:
The technology/ directory contains several technology dependent libraries of vendor IP that is
pre-generated and then wrapped by a technology agnostic wrapper entity. These wrapper entities
are then used by the other libraries.
g) Top level HDL library:
A top level HDL library is defined as a HDL library that contains a top level entity and
configuration parameters to synthesize it to an FPGA image that can run on an FPGA.
A top level HDL library should not be reused in other HDL libraries, to avoid confusion
that can occur due to conflicting or duplicate configuration parameters.
*******************************************************************************
* Run RadioHDL with GIT
> cd ~/git/hdl
> . ./init_hdl.sh # setup development environment for hdl/
# hdl/libraries, hdl/boards and hdl/applications are developed simultaneously and therefore in one git hdl/ repository
# automatically also sources ../radiohdl/init_radiohdl.sh if necessary
> compile_altera_simlibs unb1 # creates build/unb1/hdl_libraries_ip_stratixiv.txt
# creates build/quartus/<tool version> simulation models that need to be moved to /home/software/modelsim_altera_libs
> generate_ip_libs unb1 # creates build/unb1/qmegawiz/
# creates build/unb1/quartus_sh --> empty dir, why is it there?
> quartus_config unb1 # creates build/unb1/quartus/<hdllib libraries> for synthesis
# creates build/unb1/quartus/technology_select_pkg.vhd
> modelsim_config unb1 # creates build/unb1/modelsim/<hdllib libraries> for simulation
# creates build/unb1/modelsim/modelsim_project_files.txt for Modelsim commands.do
# creates build/unb1/modelsim/technology_select_pkg.vhd
> run_qsys unb1 unb1_minimal_qsys
*******************************************************************************
* Run RadioHDL with SVN
echo "Uniboard trunk is selected"
export SVN=${HOME}/svnroot/UniBoard_FP7
#Setup RadioHDL environment for UniBoard2 and new Uniboard1 applications
. ${SVN}/RadioHDL/trunk/tools/setup_radiohdl.sh
# Support old UniBoard environment (including Aartfaac and Paasar)
. ${SVN}/RadioHDL/trunk/tools/setup_unb.sh
*******************************************************************************
* 3) Programmers guide topics
*******************************************************************************
RadioHDL gear directory structure
$RADIOHDL_GEAR/config # central config files for buildsets and tools
/core # RadioHDL gear scripts
/doc # manuals
/ise # scripts for Xilinx ISE, Impact tools
/modelsim # scripts for Mentor Modelsim tool
/quartus # scripts for Altera Quartus, SOPC, QSYS tools
init_radiohdl.sh # Initialize RadioHDL for a project at $RADIOHDL_WORK
generic.sh # Collection of useful functions
# Scripts to adopt RadioHDL configuration parameters as environment variables
# or paths
set_config_path # expand config paths
set_config_variable # export config variables
set_hdllib_variable # export hdllib variables
*******************************************************************************
* Open issues:
*******************************************************************************
- Support more roots in RADIOHDL_WORK for searching HDL libraries
- Central HDL_IO_FILE_SIM_DIR = build/sim --> Project local sim dir
- avs_eth_coe.vhd per tool version? Because copying avs_eth_coe_<buildset>_hw.tcl to $HDL_BUILD_DIR
  copies the last <buildset>, using more than one buildset at a time gives conflicts.
- RadioHDL improvements requested by CSIRO for Vivado
- Service oriented architecture (SOA) using asynchronous request/response pattern
- transport: via TCP in binary or web based
- data model: more than hierarchy of files/folder/registers, object oriented nodes that can send meta information and data
- built-in models for:
. data access
. alarm conditions
. historic data and events
. programs (= methods)
. device description (= meta data, properties)
- expandability via profiles:
. DI = device integration
. DA = data access
. HDA = historical data access
- security
- authentication
- authorisation
- encryption
Modbus looks like a MM protocol, with coils (= relays = bits) and registers (words), but has
limited ranges.
XML for data language
RTD = real time data
OPC-UA tag = value + properties = information
properties : units, significant digits, thresholds
HMI = Human Machine Interface
|
|
|
OPC-UA client (consume data)
|
|
|
OPC-UA server (expose data of device)
|
|
|
driver
|
|
|
field devices (one or many data points)
Needed:
- OPC-UA SDK (software development kit)
STAT L2 ICDs for SDP
- L1 ICD 11108 STAT-NW
- L1 ICD 11109 STAT-CEP
- L2 ICD 11211 SC-SDP
- L2 ICD 11207 RCU2S-SDP
- L2 ICD 11209 STF-SDP
- L2 ICD 11218 SDP-STCA
ICD interface types:
m - Mechanical (structural, loading, tooling, etc)
f - Fluid (pneumatic, cooling, heating, condensate, fuels, lubricants, waste, exhaust, feedstocks etc)
h - Human-Machine Interface (special combination of some of the above)
###################################################################################################
# L1 ICD 11108 STAT-NW
UDP link control
- flow control = end-to-end
- congestion control = peer-to-peer within the network
> ping <IP address> # populates the ARP cache; 'arp -n <IP address>' then shows the MAC address
###################################################################################################
# L1 ICD 11109 STAT-CEP
Included:
A) Beamlet data
B) Transient buffer read out
Not included:
. SST, BST, XST, because these are for monitoring and calibration, not for science data
. Subband offload for AARTFAAC2.0 will have own EICD
LFAA-CSP_Low : OSI (Open Systems Interconnection) layers
7 Application : Not applicable, this is the level where the STAT and CEP products each perform their
- Ethernet standard [IEEE Std 802.3-2015], 40 GbE
1 Physical :
- Ethernet standard [IEEE Std 802.3-2015], 40 GbE
A) STAT-CEP Beamlet data interface:
- VERSION_ID 8b
. 2,3,4 for LOFAR1
- Data
. X, Y paired dual polarization beamlets
B) STAT-CEP Transient data read out
###################################################################################################
# L2 ICD 11211 SC-SDP
1) Uniboard2 board
a) I2C via PCC
b) actuators, sensors
2) Firmware
a) 1GbE per 4 FPGA / 10GbE at SCU
b) FPGA register access via Gemini Protocol/UDP/IPv4
c) FPGA register map:
- BSP : info, PPS, flash
- ring
- adc : JESD, WG, timestamp, input buffer, data buffer (DB), statistics (mean, power, histogram)
- Fsub : subband weights, SST
- XC : subband selection, XST
- BF : subband selection, beamlet weights, BST, beamlet output header
- TB : recording, direct DDR4 access, reading
- TDET : setup, release, info
- SO : subband selection, subband offload header
d) Concepts
- firmware runs on array of FPGA[]
- all FPGA run the same firmware image, they can behave differently dependent on:
. their static location (node ID)
. the dynamic setup via M&C
- no need to be aware of UniBoard2 or that there are 4 FPGAs per UniBoard2
e) OPC-UA representation (by OPC-UA servers)
- arrays of signal inputs, subbands, beamlets, no need to be aware of FPGA[] array
- per device, there are devices for (fig 3.3.1-1):
. HBA analogue beamformer
. Station digital beamformer
. Subband correlator
. Transient buffer
. Transient detection
###################################################################################################
# L2 ICD 11207 RCU2S-SDP
###################################################################################################
# L2 ICD 11209 STF-SDP
###################################################################################################
# L2 ICD 11218 SDP-STCA
*******************************************************************************
* Netherlands Code of Conduct for Research Integrity:
*******************************************************************************
Five principles:
- Honesty
- Scrupulousness
- Transparency
- Independence
- Responsibility
*******************************************************************************
* SKA experience:
*******************************************************************************
. analogue complex gain calibration (temperature dependent)
. beam shape tapering? (fixed)
*******************************************************************************
* To do:
- Use GIT
- Understand AXI4 streaming (versus avalon, RL =0)
. wrap between AXI4 - Avalon for MM and DP
- Global reset only on sosi info not on sosi data
- Use ARGS to define peripherals
. check docs and article
- Learn how gmi_minimal HDL code works to prepare for porting to unb2b_minimal_gmi
. cause of reboot (power cycle, overtemperature, ...)
- RCU2S-SDP signal input allocation:
Decided:
. LBA and HBA on separate RCUs and also on separate subracks and on independent rings, so SDP is independent
*******************************************************************************
* System engineering:
*******************************************************************************
- Requirements,
Requirements map to functions at any level
- Products
- ICDs
- Roles
Polarion:
- Maintain hierarchy of functions and link them to products
*******************************************************************************
- H4
. SDP dynamisch bereik figure versimpelen + formule + simuleer processing gain van FFT voor real input
. SDP dynamisch bereik figure versimpelen + formule + simuleer processing gain van FFT voor real
input
. Check timing dither with PK, because this would require detailed timing control for SDP and RCU2
. In SDP JESD details maybe too detailed design, purpose here is to explain JESD in own words to understand it without
having to read references.
. In SDP JESD details maybe too detailed design, purpose here is to explain JESD in own words to
understand it without having to read references.
. Check that +48 v is mentioned (not -48 V)
. io_ddr in the FN beamformer uses 26 M9K, of which 10 are in the IP. The IP also uses 1 M144K
  (~= 16 M9K). The FN beamformer uses 1 DDR3 module.
  --> for two DDR modules one would expect about (26 + 10) * 2 = 72 block RAMs, while TBB-base
  in LOFAR1 uses only 6 according to the MP synthesis report.
. Timing fixed on the internal sync, or on a single or periodic timestamp set by SC.
- let SDP send an event for each internal sync, to help SC align its time critical M&C
- possibly use linear interpolation for BF weights, to relieve SC from tight 1 Hz control and to
  allow fast tracking of satellites
- define timestamp as 32b seconds and 32b blocks within second
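The dynamic range item above asks to simulate the processing gain of the FFT for real input. A minimal numpy sketch (FFT size and bin numbers are illustrative, not from SDP): a CW concentrates in one bin while white noise spreads over the N/2 positive-frequency bins, so the SNR gain is about 10*log10(N/2).

```python
import numpy as np

# Toy simulation of FFT processing gain for a real input signal.
rng = np.random.default_rng(0)
N = 4096                       # FFT size (assumed, for illustration)
k = 512                        # CW placed exactly on a bin
n = np.arange(N)
x = np.cos(2 * np.pi * k * n / N)    # unit-amplitude CW, power 0.5
noise = rng.standard_normal(N)       # white noise, power ~1

# Time-domain SNR at the ADC input
snr_in = 10 * np.log10(0.5 / np.var(noise))

# Frequency-domain SNR in the CW bin, positive-frequency half only
p = np.abs(np.fft.rfft(x + noise)) ** 2
noise_bins = np.delete(p, k)         # all bins except the CW bin
snr_out = 10 * np.log10(p[k] / np.mean(noise_bins))

gain = snr_out - snr_in              # expect ~10*log10(N/2) dB
```

For N = 4096 this gives a processing gain of about 33 dB.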
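The linear interpolation idea above can be sketched as follows; the weight values and array size are made up for illustration, this is not SDP firmware code. SC supplies endpoint weights once per second and the firmware tracks smoothly in between.

```python
import numpy as np

# Hypothetical sketch: interpolate complex BF weights between two 1 Hz
# control updates, so SC does not need tight sub-second control.
w0 = np.array([1.0 + 0.0j, 0.5 - 0.5j])   # weights at t = 0 s (assumed)
w1 = np.array([0.9 + 0.1j, 0.6 - 0.4j])   # weights at t = 1 s (assumed)

def weight_at(t):
    """Linearly interpolated weights for 0 <= t <= 1 s."""
    return (1.0 - t) * w0 + t * w1
```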
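The proposed timestamp format (32 bits of whole seconds plus 32 bits counting blocks within that second) packs into one 64-bit word; a sketch with illustrative function names:

```python
# Hypothetical helpers for the proposed 32b + 32b timestamp format.

def pack_timestamp(seconds: int, block: int) -> int:
    """Pack (seconds, block-within-second) into one 64-bit word."""
    assert 0 <= seconds < 2**32 and 0 <= block < 2**32
    return (seconds << 32) | block

def unpack_timestamp(ts: int) -> tuple:
    """Inverse of pack_timestamp: return (seconds, block)."""
    return ts >> 32, ts & 0xFFFFFFFF
```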
@@ -501,9 +346,11 @@ STIN
PK:
. Write design decision on dithering (meant to make ADC effectively more linear):
1) none
2) level, using an additive CW --> suppress RFI harmonics already at the ADC by making the ADC
   sampling more linear through crossing more levels
3) time, using delays --> does this work anyway, because the digital BF already suppresses RFI
   from all directions in which it does not point. The dithering makes it look like the RFI
   comes from all directions; this does not help, does it?
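Option 2) can be illustrated numerically. The toy sketch below (not SDP code) uses uniform random level dither rather than the additive CW mentioned above, but the mechanism is the same: making the signal cross more quantizer levels turns deterministic harmonic distortion into a flat noise floor. All sizes and amplitudes are made up for illustration.

```python
import numpy as np

# A small sine through a coarse quantizer produces strong odd harmonics;
# ~1 LSB of level dither before quantizing removes them.
rng = np.random.default_rng(1)
N, k1, q, A = 4096, 50, 0.25, 0.3   # FFT size, tone bin, LSB, amplitude
x = A * np.sin(2 * np.pi * k1 * np.arange(N) / N)

def quantize(sig):
    """Mid-tread quantizer with step q."""
    return q * np.round(sig / q)

p_plain = np.abs(np.fft.rfft(quantize(x))) ** 2
dither = rng.uniform(-q / 2, q / 2, N)          # 1 LSB uniform dither
p_dith = np.abs(np.fft.rfft(quantize(x + dither))) ** 2

h3_plain = p_plain[3 * k1]   # 3rd-harmonic power without dither (large)
h3_dith = p_dith[3 * k1]     # 3rd-harmonic power with dither (noise floor)
```

With these numbers the third harmonic drops by well over 6 dB at the cost of a slightly higher noise floor.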
PDR:
- Polarization correction at station: via subband weights or via BF weights?
@@ -516,11 +363,12 @@ PDR:
- TB size
- S_sub_bf = 488
- Subband weights update rate > 10 min is sufficient?
- There are >= 488 independent Station beams per polarization. This also implies that the
  polarizations can have independent Station beams, so different frequencies and pointings. In
  LOFAR1 a subband and a beamlet are defined as an (X,Y) tuple. For LOFAR2.0 the beamlets are
  still output as tuples, but the X and Y are treated independently, so beamlet index i for X
  may use a different subband frequency and pointing than beamlet index i for Y. Therefore in
  LOFAR2.0 define a subband and a beamlet per single polarization.
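The per-polarization beamlet definition above can be sketched as a small data model; the class and field names are illustrative, not from the SDP software.

```python
from dataclasses import dataclass

# Hypothetical model of the LOFAR2.0 choice: a beamlet is defined per
# single polarization, so beamlet index i for X may carry a different
# subband and pointing than beamlet index i for Y.
@dataclass(frozen=True)
class Beamlet:
    subband: int        # subband frequency index
    pointing: tuple     # e.g. (azimuth, elevation) in degrees

N_BEAMLETS = 488        # S_sub_bf = 488 beams per polarization

beamlets = {pol: [None] * N_BEAMLETS for pol in ("X", "Y")}
beamlets["X"][0] = Beamlet(subband=300, pointing=(0.0, 90.0))
beamlets["Y"][0] = Beamlet(subband=412, pointing=(10.0, 45.0))

# Output is still an (X, Y) tuple per beamlet index:
tuple_out = (beamlets["X"][0], beamlets["Y"][0])
```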
- How to continue SE after PDR:
. Update ADD with OAR answers and internal remarks from team
. Mapping of function tree, L2 requirements and L3 products in ADD
@@ -528,7 +376,8 @@ PDR:
. Add station functions from ADD to function tree in Polarion?
- what is the purpose of the function tree, does it provide a check list?
- how deep will the function tree go, till the lowest level products?
. Do we need more SE views than we already have with states/modes, requirements, functions and
  products?
. ICDs
. L4 requirements?
. L4 products?
@@ -229,17 +229,18 @@ BEGIN
);
---------------------------------------------------------------------------------------
-- TX: FIFO: dp_clk -> tx_clk and with fill level/eop trigger so we can deliver packets to the MAC fast enough
---------------------------------------------------------------------------------------
gen_dp_fifo_fill_eop : FOR i IN 0 TO g_nof_macs-1 GENERATE
u_dp_fifo_fill_eop : ENTITY dp_lib.dp_fifo_fill_eop
GENERIC MAP (
g_technology     => g_technology,
g_use_dual_clock => TRUE,
g_data_w         => c_xgmii_data_w,
g_empty_w        => c_tech_mac_10g_empty_w,
g_use_empty      => TRUE,
g_fifo_fill      => g_tx_fifo_fill,
g_fifo_size      => g_tx_fifo_size
)
PORT MAP (
wr_rst => dp_rst,