1. Test Data Compression Architecture for Low-Power VLSI Testing
Publication info: World Applied Sciences Journal, Vol. 29, No. 8, 2014
Number of pages: 4
With the ever-increasing integration capability of semiconductor technology, today’s large integrated circuits require an increasing amount of data for testing, which lengthens test time and raises tester memory requirements. Larger test data sets not only demand more memory but also greatly increase testing time, and these remain the bottleneck for testing the whole system. These issues are addressed by test data compression, which reduces the test data volume without affecting the overall performance of the system. ISCAS’89 benchmark circuits are used to compress the test sets, and the test vectors are generated with MINTEST. The main contribution of the paper is to reduce the test data volume and the test data memory by using coding schemes such as variable-length coding, fixed-length coding and so on.
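As a rough illustration of the kind of variable-length run-length coding the paper refers to, the following Python sketch Golomb-encodes a don't-care-filled test-vector stream. The vectors, the 0-fill rule and the group size m are illustrative assumptions, not the exact codes or test sets evaluated in the paper.

from math import log2

def fill_dont_cares(vector):
    # Map don't-care bits ('X') to 0, which lengthens the runs of zeros.
    return vector.replace('X', '0').replace('x', '0')

def golomb_encode(bits, m=4):
    # Encode each run of 0s terminated by a 1 with a Golomb code of group size m.
    assert m & (m - 1) == 0, "m is assumed to be a power of two"
    r_bits = int(log2(m))
    out, run = [], 0
    for b in bits:
        if b == '0':
            run += 1
        else:  # a '1' terminates the current run of zeros
            q, r = divmod(run, m)
            out.append('1' * q + '0')               # unary quotient plus separator
            out.append(format(r, '0%db' % r_bits))  # fixed-length remainder
            run = 0
    # A trailing run of zeros would need an end marker in a complete codec.
    return ''.join(out)

if __name__ == "__main__":
    test_set = ["0X000001", "0000X001"]                  # toy test cubes with don't-cares
    stream = ''.join(fill_dont_cares(v) for v in test_set)
    code = golomb_encode(stream, m=4)
    print(len(stream), "->", len(code), "bits")          # 16 -> 8 bits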

2. Noise Removal Using Mixtures of Projected Gaussian Scale Mixtures
Publication info: World Applied Sciences Journal, Vol. 29, No. 8, 2014
Number of pages: 7
Denoising of natural images is a fundamental and challenging problem in image processing. Because the problem depends on the type of noise, the amount of noise and the type of image, no single approach is practical in all cases. The wavelet transform is used because it is localized in both the frequency and spatial domains. The general wavelet-based denoising method involves three steps: 1) compute the wavelet decomposition of the image, 2) threshold the detail coefficients, and 3) compute the wavelet reconstruction. Dimension reduction methods search for the manifolds in the high-dimensional space on which the data reside; the technique used in this paper is Principal Component Analysis (PCA), a well-known technique for mapping n-dimensional vectors into k-dimensional vectors. The paper presents a new statistical model for image restoration in which neighbourhoods of wavelet subbands are modeled by a discrete mixture of linear projected Gaussian Scale Mixtures (MPGSM). In each projection, a lower-dimensional approximation of the local neighbourhood is obtained, thereby modeling the strongest correlations in that neighbourhood. The model is fitted with the Expectation Maximisation (EM) algorithm.
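As a minimal sketch of the generic three-step wavelet denoising procedure listed above (not the MPGSM/EM model itself), the following Python code uses PyWavelets with an assumed wavelet, decomposition level and threshold:

import numpy as np
import pywt  # PyWavelets

def wavelet_denoise(image, wavelet="db2", level=2, threshold=20.0):
    # 1) Compute the wavelet decomposition of the image.
    coeffs = pywt.wavedec2(image, wavelet, level=level)
    # 2) Soft-threshold the detail coefficients; the approximation is kept as-is.
    approx, details = coeffs[0], coeffs[1:]
    details = [tuple(pywt.threshold(d, threshold, mode="soft") for d in band)
               for band in details]
    # 3) Compute the wavelet reconstruction.
    return pywt.waverec2([approx] + details, wavelet)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.tile(np.linspace(0, 255, 64), (64, 1))    # toy gradient image
    noisy = clean + rng.normal(0, 25, clean.shape)       # additive Gaussian noise
    denoised = wavelet_denoise(noisy)[:64, :64]          # trim any boundary excess
    print("noisy RMSE:   ", round(float(np.sqrt(np.mean((noisy - clean) ** 2))), 2))
    print("denoised RMSE:", round(float(np.sqrt(np.mean((denoised - clean) ** 2))), 2))

In the paper's approach, the fixed thresholding in step 2 would instead be driven by estimation under the MPGSM model fitted to each subband neighbourhood.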

3. Authentication Verification and Remote Digital Signing Based on Embedded ARM (LPC2378) Platform
Publication info: World Applied Sciences Journal, Vol. 29, No. 9, 2014
Number of pages: 5
In the current digital world, the need for digital signing has increased exponentially. In general, any electronic document transmitted electronically or physically needs to be verified to determine whether it has been tampered with, technically digital verification/authentication. Digital signing is one of the digital frameworks that can effectively address these issues in an optimized manner. In brief, the project works as follows: analog or digital data is given as input to a DSP chip (via an A/D interface), which performs basic image operations. The processed data then flows through a 32-bit embedded ARM microprocessor, which performs digital encryption via the RSA or IDEA algorithm and outputs the encrypted result. With this system, data of the preferred length can be transmitted according to the user or application requirement. Further, the digitally encrypted data can be transmitted via Ethernet. The Ethernet interface incorporated in this system offers a dual redundant network and long-distance communication, which ensures the disturbance rejection capability and reliability of the communication network.
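To make the signing step concrete, here is a hedged textbook RSA sign/verify sketch in Python, illustrating the "hash, then encrypt the digest with the private key" idea described above. The tiny hard-coded key is hypothetical and insecure, no padding scheme is used, and neither the IDEA cipher nor the LPC2378 firmware is shown.

import hashlib

# Toy RSA key pair (p = 61, q = 53): modulus N, public exponent E, private exponent D.
N, E, D = 3233, 17, 2753

def sign(message):
    # Hash the message and 'encrypt' the digest with the private exponent.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(digest, D, N)

def verify(message, signature):
    # Recover the digest with the public exponent and compare against a fresh hash.
    digest = int.from_bytes(hashlib.sha256(message).digest(), "big") % N
    return pow(signature, E, N) == digest

if __name__ == "__main__":
    doc = b"electronic document"
    sig = sign(doc)
    print(verify(doc, sig))                   # True: signature matches the document
    print(verify(b"tampered document", sig))  # almost certainly False: tampering is detected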