642 Works

Fuzzy Cluster Forests R Code

Abdelkarim Ben Ayed, Mohamed Ben Halima & Adel M. Alimi
An improved version of the original cluster forest algorithm, obtained by replacing the base clustering algorithm (k-means) with fuzzy C-means. Experiments on eight real-world data sets show that fuzzy cluster forest outperforms most of the other clustering algorithms it was compared against.

Multi-Scale Patch-Based Image Restoration - Super Resolution

Vardan Papyan & Michael Elad
Many image restoration algorithms in recent years are based on patch processing. The core idea is to decompose the target image into fully overlapping patches, restore each of them separately, and then merge the results by plain averaging. This concept has been demonstrated to be highly effective, often leading to state-of-the-art results in denoising, inpainting, deblurring, segmentation, and other applications. While the above is indeed effective, this approach has one major flaw: the prior...
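
The decompose–restore–average scheme described above can be sketched in a few lines of NumPy. This is a generic illustration, not the authors' implementation; `restore_fn` is a hypothetical placeholder for whatever per-patch prior or restoration operator is used.

```python
import numpy as np

def restore_by_patches(img, patch=8, restore_fn=None):
    """Decompose img into fully overlapping patches, apply a per-patch
    restoration operator, and merge the results by plain averaging."""
    if restore_fn is None:
        restore_fn = lambda p: p  # identity placeholder for the patch prior
    H, W = img.shape
    out = np.zeros((H, W), dtype=float)
    weight = np.zeros((H, W), dtype=float)
    for i in range(H - patch + 1):
        for j in range(W - patch + 1):
            p = img[i:i + patch, j:j + patch].astype(float)
            out[i:i + patch, j:j + patch] += restore_fn(p)
            weight[i:i + patch, j:j + patch] += 1.0
    # each pixel is the average of all patch estimates that cover it
    return out / weight
```

With the identity operator the averaging is lossless, which makes the merging step easy to verify before plugging in a real patch prior.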

Results Analysis: Modeling Non-Uniform Memory Access on Large Compute Nodes with the Cache-Aware Roofline Model

Brice Goglin, Emmanuel Jeannot, Aleksandar Ilic, Leonel Sousa & Nicolas Denoyelle
The Cache-Aware Roofline Model (CARM) is an insightful, yet simple model designed to address this issue. It provides feedback on potential application bottlenecks and shows how far the application performance is from the achievable hardware upper bounds. However, it does not encompass NUMA systems and next-generation processors with heterogeneous memories. Yet, some application bottlenecks belong to those memory subsystems and would benefit from the CARM insights. In this paper, we fill the missing requirements to...

LOWESS analysis for visualization of GEDDs

Long H. Do, William C. Mobley & Nishant Singhal
In our published findings (Do et al. 2015), we found evidence against domains of genome-wide gene dysregulation (GEDDs) in the Down syndrome mouse model and DS human iPSCs, in contrast to the original findings of Letourneau et al. 2014. We have supplied the following script to allow other investigators to quickly visualize and compare their RNAseq gene expression datasets in order to search for corresponding GEDDs. The script performs a locally weighted scatterplot smoothing (LOWESS) analysis of RNAseq...
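
For readers unfamiliar with LOWESS, the smoothing step can be sketched with a minimal NumPy-only implementation: at each point, a tricube-weighted linear fit over the nearest neighbors gives the smoothed value. The published script uses its own LOWESS routine; this is a generic single-pass sketch (no robustness iterations) with an assumed smoothing fraction `frac`.

```python
import numpy as np

def lowess(x, y, frac=0.5):
    """Locally weighted scatterplot smoothing (LOWESS):
    first-order local fits with tricube weights, single pass."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))   # neighborhood size
    order = np.argsort(x)
    x, y = x[order], y[order]
    fitted = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        h = np.sort(d)[k - 1]            # bandwidth: k-th nearest distance
        u = np.clip(d / max(h, 1e-12), 0.0, 1.0)
        w = (1.0 - u ** 3) ** 3          # tricube kernel weights
        # weighted linear least squares, evaluated at x[i]
        coeffs = np.polyfit(x, y, 1, w=np.sqrt(w))
        fitted[i] = np.polyval(coeffs, x[i])
    return x, fitted
```

On exactly linear data the local fits reproduce the line, which is a quick sanity check before smoothing noisy RNAseq expression values.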

Approximate Drop Volume from SEM Image

Atif Anwer
Objective: approximate the volume of drops in an image taken under a Scanning Electron Microscope (SEM). The objective was to detect and approximate the volume of drops in a pipette/test tube image taken under a Scanning Electron Microscope (SEM). The source was a research problem for a postgraduate student/friend in the Chemical Engineering Department, UTP. The Matlab code does the following: detects and identifies drops in the SEM image; draws a boundary around the detected drops with centers...

Adaptive Surround Modulation

Arash Akbarinia
In this article we have addressed the problem of colour constancy through a biologically-inspired centre-surround modulation. In our dynamic algorithm we account for the contrast variability of receptive fields typical of our visual cortex.

Delta-Ramp Encoder for Amplitude Sampling and its Interpretation as Time Encoding (codes)

Pablo Martinez-Nuevo, Hsin-Yu Lai & Alan V. Oppenheim
These are the codes to generate Figures 8-11 in the paper: P. Martínez-Nuevo, H.-Y. Lai, A. V. Oppenheim, "Delta-Ramp Encoder for Amplitude Sampling and its Interpretation as Time Encoding", IEEE Transactions on Signal Processing, 2019.

Revisiting Antarctic ice loss due to marine ice cliff instability

Tamsin Edwards
This Code Ocean capsule reproduces the main analysis of the below paper, which emulates and recalibrates the Antarctic ice sheet model ensemble projections of DeConto and Pollard (2016), Nature. Tamsin L. Edwards, Mark Brandon, Gael Durand, Neil R. Edwards, Nicholas R. Golledge, Philip B. Holden, Isabel Nias, Antony J. Payne, Catherine Ritz and Andreas Wernecke (2019), Revisiting Antarctic ice loss due to marine ice cliff instability, Nature.

Tutorial for the FMI++ Python Interface

Edmund Widl
This capsule provides simple test cases for exporting ns-3 scripts as FMUs for Co-Simulation. The usage of the exported FMUs is demonstrated with the help of simple Python scripts.

Features extraction based on Beta-Elliptic Model and Fuzzy Elementary Perceptual Codes

Thameur Dhieb, Sourour Njah, Houcine Boubaker, Wael Ouarda, Mounir Ben Ayed & Adel M. Alimi
The presented code consists of the preprocessing and the segmentation of online handwriting into a sequence of Beta strokes in a first step. Then, from each Beta stroke, we extract a set of static and dynamic features using four features extraction techniques based on the Beta-Elliptic model and the Fuzzy Elementary Perceptual Codes. Next, all the segments which are composed of N consecutive Beta strokes are categorized into groups and subgroups according to their position...

A Multivariate Regime-switching GARCH Model

Markus Haas & Ji-Chun Liu
We consider a multivariate Markov-switching GARCH model which allows for regime-specific volatility dynamics, leverage effects, and correlation structures. Conditions for stationarity and expressions for the moments of the process are derived. A Lagrange Multiplier test against misspecification of the within-regime correlation dynamics is proposed, and a simple recursion for multi-step-ahead conditional covariance matrices is deduced. As an application, we model the dynamics of the joint distribution of global stock market and real estate equity returns....

Matrix Completion with Temporal Constraint

Andy J Ma, Jacky CP Chan, Frodo KS Chan, Pong C Yuen, Terry CF Yip, Yee-Kit Tse, Vincent WS Wong & Grace LH Wong
Regular medical records are useful for medical practitioners to analyze and monitor patients' health status, especially for those with chronic disease. However, such records are usually incomplete due to the unpunctuality and absence of patients. In order to resolve the missing data problem over time, tensor-based models have been developed for missing data imputation in recent papers. This approach makes use of the low-rank tensor assumption for highly correlated data in a short time interval. Nevertheless, when...
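
The low-rank imputation idea behind such methods can be illustrated in matrix form (the paper itself works with tensors and temporal constraints): alternately project onto a low-rank approximation and restore the observed entries. This is a generic sketch, not the authors' algorithm.

```python
import numpy as np

def lowrank_impute(X, mask, rank=1, n_iter=200):
    """Fill missing entries of X (mask == False) by iterating between
    a rank-truncated SVD approximation and re-imposing observed data."""
    Z = np.where(mask, X, 0.0)           # initialize missing entries at 0
    for _ in range(n_iter):
        U, s, Vt = np.linalg.svd(Z, full_matrices=False)
        s[rank:] = 0.0                   # keep only the leading singular values
        low_rank = (U * s) @ Vt
        Z = np.where(mask, X, low_rank)  # observed entries stay fixed
    return Z
```

When the underlying matrix really is low rank and few entries are missing, the iteration recovers the hidden values; the temporal-constraint extension in the paper addresses the case where this plain low-rank assumption breaks down over longer time spans.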

Minimum Mean Brightness Error Bi-Histogram Equalization for Real Time Image Enhancement

Rajiv Tripathi & Tirupathi Raju Kanumuri
Minimum mean brightness error bi-histogram equalization is used for image enhancement because of its optimal brightness preservation capability. Because the absolute mean brightness error is calculated at each gray level, this technique needs to repeat bi-histogram equalization for all the gray levels present in the image. The computation time of this technique increases drastically with the size of the image, which is not suitable for real-time applications. In this paper, an algorithm has been...
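
The bi-histogram equalization step that the abstract says must be repeated for every candidate separation point can be sketched as follows. This is a generic BHE sketch, not the paper's accelerated algorithm; the naive MMBEBHE would call it once per gray level and keep the separation point minimizing the absolute mean brightness error.

```python
import numpy as np

def bhe(img, sp=None):
    """Bi-histogram equalization: split the histogram at separation
    point sp (default: mean gray level), equalize the two sub-histograms
    independently, mapping them to [0, sp] and [sp+1, 255]."""
    img = np.asarray(img, dtype=np.uint8)
    if sp is None:
        sp = int(img.mean())
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    lut = np.zeros(256)
    lo, hi = hist[:sp + 1], hist[sp + 1:]
    if lo.sum() > 0:                      # equalize lower sub-histogram
        cdf = np.cumsum(lo) / lo.sum()
        lut[:sp + 1] = np.round(cdf * sp)
    if hi.sum() > 0:                      # equalize upper sub-histogram
        cdf = np.cumsum(hi) / hi.sum()
        lut[sp + 1:] = sp + 1 + np.round(cdf * (254 - sp))
    return lut[img].astype(np.uint8)
```

Because each half is mapped into its own range, pixels below the separation point stay below it, which is what preserves the overall brightness relative to plain histogram equalization.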

Improving Energy Efficiency of Field-Coupled Nanocomputing Circuits by Evolutionary Synthesis

Marco A. Ribeiro, Iago A. Carvalho, Jeferson F. Chaves, Gisele L. Pappa & Omar P. Vilela Neto
Moore's law drove decades of advances in computer performance through the evolution of the transistor. Despite all the success in its improvement, current technology is reaching its physical limits, and several replacements are under investigation, such as Field-Coupled Nanocomputing devices. These devices achieve information transfer and computation via local field interactions, reaching ultra-low power consumption. Nevertheless, there exists a hard energy limit related to the laws of thermodynamics that bounds any digital evaluation. To reduce...

FingerNet: A Unified Deep Network for Fingerprint Minutiae Extraction

Yao Tang, Yuhang Liu & Jufu Feng
FingerNet is a universal deep ConvNet for extracting fingerprint representations including orientation field, segmentation, enhanced fingerprint and minutiae. It can produce reliable results on both rolled/slap and latent fingerprints.

Supplementary code for: Simple Generation of Gamma, Gamma-Gamma and K Distributions with Exponential Autocorrelation Function

Dima Bykhovsky
This code provides an example of numerical generation of random process with gamma distribution and exponential auto-correlation function.
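
One common way to generate such a process (the capsule's exact method may differ) is a memoryless transform of a Gaussian AR(1): the exponential autocorrelation of the AR(1) carries over only approximately through the nonlinearity, while the gamma marginal is exact. The function name and parameters below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def gamma_expo_acf(n, shape=2.0, scale=1.0, rho=0.9, seed=0):
    """Gamma-marginal random process with approximately exponential
    autocorrelation, via a memoryless transform of a Gaussian AR(1)."""
    rng = np.random.default_rng(seed)
    g = np.empty(n)
    g[0] = rng.standard_normal()
    innov = rng.standard_normal(n) * np.sqrt(1.0 - rho ** 2)
    for t in range(1, n):                 # stationary AR(1), lag-k ACF = rho**k
        g[t] = rho * g[t - 1] + innov[t]
    u = stats.norm.cdf(g)                 # map to uniform marginals
    return stats.gamma.ppf(u, a=shape, scale=scale)
```

Matching the output autocorrelation exactly requires pre-distorting the Gaussian ACF, which is the kind of correction a dedicated generator addresses.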

Cognition Open Data (COD) Reproducibility Report: JcuWB

Tom Hardwicke
This is one of 35 analytic reproducibility reports arising from the Cognition Open Data (COD) project. JcuWB is the ID code used to refer to this specific report. The COD project involved attempting to reproduce a subset of key target outcomes reported in 35 articles published in the journal Cognition by repeating the original analyses upon the original data. For more details, please visit the Open Science Framework project: https://osf.io/wn8fd/.

Designing high-resolution time–frequency and time–scale distributions for the analysis and classification of non stationary signals

Boualem Boashash & Samir Ouelha
The scripts provided here can be used to reproduce most of the figures that appear in the following paper: "Designing high-resolution time–frequency and time–scale distributions for the analysis and classification of non stationary signals". The main feature selection and classification codes as well as the TFSAP 7.1 toolbox source code can also be found on the following GitHub repository: https://github.com/Prof-Boualem-Boashash/TFSAP-7.1-software-package

Simulated annealing lifting Quasi-cyclic low-density parity-check (QC-LDPC)

Vasiliy Usatyuk & Ilya Vorobyev
Source code for 'Simulated Annealing Method for Construction of High-Girth QC-LDPC Codes' by Vasiliy Usatyuk and Ilya Vorobyev, 41st International Conference on Telecommunications and Signal Processing (TSP) 2018, 4-6 July, Athens, Greece. It constructs regular and irregular QC-LDPC codes from a protograph with the required minimal EMD value.

Improving contig binning of metagenomic data using d2S oligonucleotide frequency dissimilarity

Ying Wang, Kun Wang, Yang Young Lu & Fengzhu Sun
d2SBin is an easy-to-use tool for improving contig binning; it adjusts the contigs among bins based on the output of any existing binning tool. The tool is taxonomy-free and relies only on the k-tuples of a single metagenomic sample. d2SBin is based on the observation that relative sequence compositions are similar across different regions of the same genome, but differ between genomes. Current tools generally use the normalized frequency of k-tuples directly, which is actually the absolute rather than the relative sequence...

6mer seed toxicity in tumor suppressive microRNAs: toxic seed and nucleotide distributions, figure 3d, 3e, supplementary figure 5b

Elizabeth T. Bartom
This capsule accompanies the paper "6mer seed toxicity in tumor suppressive microRNAs" and shows the landscape of toxic seeds, non-toxic seeds, and all four single nucleotides across different gene boundaries. Multiple genes are aggregated to show the general patterns in these meta-plots.

Unmixing Signal and Noise for Photon-Efficient Active Imaging

Joshua Rapp
We introduce a new approach to depth and reflectivity estimation that emphasizes the unmixing of contributions from signal and noise sources. At each pixel in an image, short-duration range gates are adaptively determined and applied to remove detections likely to be due to noise. For pixels with too few detections to perform this censoring accurately, data are combined from neighboring pixels to improve depth estimates, where the neighborhood formation is also adaptive to scene content.

Minimal transparency is undermining the credibility of the social sciences

Tom Hardwicke, Joshua Wallach, Mallory Kidwell & John Ioannidis
Policy decisions based on flawed research can have substantial economic, social, and individual costs. Recently there have been calls for improved transparency in the social sciences to ensure that a robust evidence base is available to policy makers. New initiatives such as the Transparency and Openness Promotion (TOP) Guidelines have been launched in an effort to improve the transparency and credibility of the scientific literature. However, the cumulative effect of these initiatives is currently unknown....

Cognition Open Data (COD) Reproducibility Report: IeIFy

Tom Hardwicke
This is one of 35 analytic reproducibility reports arising from the Cognition Open Data (COD) project. IeIFy is the ID code used to refer to this specific report. The COD project involved attempting to reproduce a subset of key target outcomes reported in 35 articles published in the journal Cognition by repeating the original analyses upon the original data. For more details, please visit the Open Science Framework project: https://osf.io/wn8fd/.

Comparative Metatranscriptomics Workflow (CoMW)

Muhammad Zohaib Anwar, Anders Lanzen, Toke Bang-Andreasen & Carsten Suhr Jacobsen
Comparative Metatranscriptomics Workflow (CoMW) is a standardized and validated workflow for functionally classifying quality-filtered mRNA reads from metatranscriptomic or total RNA studies generated using NGS short reads. CoMW classifies these reads using assembled contigs mapped to the reference databases provided and cited.
