Fusion Middleware

DakhaBrakha

Greg Pavlik - Sun, 2022-08-28 15:05

Full live performance. I've been enjoying these funk-folk-jazz songs for some time. Allegedly the music is in "old Ukrainian" - I don't speak Ukrainian or the closely related Carpatho-Russian dialect of Transcarpathia, so I can't really discern what that means - but consider the vocals as instrumentation in any case. Either way, this music is great fun at times and always stirs the emotions. 
--------

A barely related side note: as a case study in creative folk-culture appropriation to modern forms, I recommend the Ukrainian musical film Hutsulka Ksenya. The plot revolves around a young American whose late father leaves him his fortune on the condition that he marry a Ukrainian woman. Distinctly unenthusiastic, he visits the Carpathian region (presumably Zakarpattia Oblast, from which the Pavlik family emigrated) and falls for a young Hutsul woman.... I won't say more; it's a terribly creative film and thoroughly enjoyable.

Hutsuls, as an aside, are not strictly speaking Ukrainians in the ethno-cultural sense, but in the broader sense of Ukraine as a multi-cultural nation. 

Isaac of Syria in Dostoevsky

Greg Pavlik - Wed, 2022-08-17 11:32
My second (and more careful) read of Brothers Karamazov (this time the Pevear/Volokhonsky translation) established it as the most influential piece of world literature I have encountered to date. I regard it as the pinnacle of modern forms, but also among the most piercing critiques of modernity as such. For years I have re-read it in part and in full, along with numerous commentaries on the novel, Dostoevsky's corpus in toto, and of course works on Dostoevsky himself more generally. Of the latter, I cannot imagine a finer literary biography than the work of the late Joseph Frank.

One recurring theme is the attempt to uncover the influences behind the portrait of the staretz Zosima. Many figures have been cited, including the famous Tikhon of Zadonsk, to whom an entire book on the topic has been dedicated. To my mind, however, the most obvious parallel to the teachings of Zosima is the 7th century ascetic Isaac of Syria. The parallels between his ascetic writings and the long chapter on Zosima's homilies in Karamazov could not be clearer. I quote liberally from two sites with supporting details and illustrations.

------------------
There is an interesting connection between St Isaac of Syria and Dostoevsky. The latter owned an 1858 edition of the Slavonic translation of the Homilies by St Paisius Velichkovsky (Victor Terras, A Karamazov Companion: Commentary on the Genesis, Language, and Style of Dostoevsky’s Novel [Madison: U of Wisconsin, 1981], p. 22). Furthermore, Dostoevsky mentions St Isaac’s Ascetical Homilies by name twice in The Brothers Karamazov. The first time is in Part I, Book III, Chapter 1, ‘In the Servants’ Quarters’, where the narrator observes that Grigory Vasilievich, Fyodor Karamazov’s manservant, ‘somewhere obtained a copy of the homilies and sermons of “Our God-bearing Father, Isaac the Syrian”, which he read persistently over many years, understanding almost nothing at all of it, but perhaps precisely for that reason prizing and loving it all the more’ (Fyodor Dostoevsky, The Brothers Karamazov, trans. Richard Pevear and Larissa Volokhonsky [NY: Vintage, 1991], p. 96). Dostoevsky then mentions the book again in 4.11.8, this time in the rather more sinister context of Ivan’s third meeting with Smerdyakov, when the latter 'took from the table that thick, yellow book, the only one lying on it, the one Ivan had noticed as he came in, and placed it on top of the bills. The title of the book was The Homilies of Our Father among the Saints, Isaac the Syrian. Ivan Fyodorovich read it mechanically' (Dostoevsky, p. 625).

But more importantly, Victor Terras has pinpointed a number of St Isaac’s teachings that make a definite appearance in the words of Elder Zosima in II.VI.3, especially in (g) ‘Of Prayer, Love, and the Touching of Other Worlds’ (Dostoevsky, pp. 318-20), and (i) ‘Of Hell and Hell Fire: A Mystical Discourse’ (Dostoevsky, pp. 322-4). Terras quotes the following passage from ‘Homily Twenty-Seven’ as being ‘important for the argument of The Brothers Karamazov’ (Terras, p. 23):


Sin, Gehenna, and Death do not exist at all with God, for they are effects, not substances. Sin is the fruit of free will. There was a time when sin did not exist, and there will be a time when it will not exist. Gehenna is the fruit of sin. At some point in time it had a beginning, but its end is not known. Death, however, is a dispensation of the wisdom of the Creator. It will rule only a short time over nature; then it will be totally abolished. Satan’s name derives from voluntarily turning aside [the Syriac etymological meaning of satan] from the truth; it is not an indication that he exists as such naturally. (Ascetical Homilies, p. 133)
Terras may, however, be on the wrong trail with this particular passage, though not perhaps with the rest of his parallels, since according to a note in the translation, this particular homily only exists in Syriac (Ascetical Homilies, p. 133), and does not appear to have been available in any translation Dostoevsky would have read (Introduction, Ascetical Homilies, pp. lxxvi-lxxvii). Another interesting, though less important, discrepancy, is that Pevear and Volokhonsky, in their note on the name of St Paisius (he is referenced in I.I.5 [Dostoevsky, p. 27], and footnoted on p. 780 of Pevear’s and Volokhonsky’s translation), date Dostoevsky’s edition of the Elder’s translation of St Isaac to 1854 rather than 1858. Furthermore, J.M.E. Featherstone lists among St Paisius's works, Svjatago otca našego Isaaka Sirina episkopa byvšago ninevijskago, slova duxovno-podvižničeskija perevedennyja s grečeskago.... (Moscow, 1854), thus making Pevear and Volokhonsky's date more likely, it would seem ('Select Bibliography', The Life of Paisij Velyčkovs'kyj, trans. J.M.E. Featherstone [Cambridge, MA: Harvard U, 1989], p. 163 ).

I just wanted to highlight briefly this interesting connection. At an even deeper level, however, it has been picked up on, for one, by Archimandrite Vasileios of Iveron. Having considered the ‘artistic’ gifts of St Isaac and the spiritual insight of Dostoevsky, he concludes, ‘Thus, whether you read Abba Isaac, or Dostoevsky, in the end you get the same message, grace and consolation’ (‘Από τον Αββά Ισαάκ’, p. 100).


source: http://logismoitouaaron.blogspot.com/2009/02/this-glory-of-orientst-isaac-syrian.html
------------------

Though the teachings of Elder Zosima from Dostoevsky’s Brothers Karamazov seem exotic to many western readers and possibly unorthodox, they in fact show a remarkable similarity to those of a favorite 7th century eastern saint, St Isaac the Syrian.   We know that Dostoevsky owned a newly-available translation of St Isaac’s Ascetical Homilies, and this volume is in fact mentioned by name twice in the novel, though in seemingly inconsequential contexts.  Dostoevsky was no doubt deeply affected by the saint’s spirituality, and I think Zosima’s principal views in fact reflect and are indebted to those of St Isaac.  Below I will list some of these distinctive views, with illustrating quotes from both the fictional Elder Zosima and St Isaac himself. (And note: these were simply the quotes that I could find very easily; I’m sure more digging would find even more striking parallels)

Love for all creation:

Elder Zosima: “Love God’s creation, love every atom of it separately, and love it also as a whole; love every green leaf, every ray of God’s light; love the animals and the plants and love every inanimate object. If you come to love all things, you will perceive God’s mystery inherent in all things; once you have perceived it, you will understand it better and better every day.  And finally you will love the whole world with a total, universal love.”

St Isaac: “What is a merciful heart? It is a heart on fire for the whole of creation, for humanity, for the birds, for the animals, for demons, and for all that exists. By the recollection of them the eyes of a merciful person pour forth tears in abundance. By the strong and vehement mercy that grips such a person’s heart, and by such great compassion, the heart is humbled and one cannot bear to hear or to see any injury or slight sorrow in any in creation. For this reason, such a person offers up tearful prayer continually even for irrational beasts, for the enemies of the truth, and for those who harm her or him, that they be protected and receive mercy. And in like manner such a person prays for the family of reptiles because of the great compassion that burns without measure in a heart that is in the likeness of God.”

 Responsibility for all:

Elder Zosima: “There is only one salvation for you: take yourself up, and make yourself responsible for all the sins of men. For indeed it is so, my friend, and the moment you make yourself sincerely responsible for everything and everyone, you will see at once that it is really so, that it is you who are guilty on behalf of all and for all. Whereas by shifting your own laziness and powerlessness onto others, you will end by sharing in Satan’s pride and murmuring against God. ”

St Isaac: “Be a partaker of the sufferings of all…Rebuke no one, revile no one, not even those who live very wickedly. Spread your cloak over those who fall into sin, each and every one, and shield them. And if you cannot take the fault on yourself and accept punishment in their place, do not destroy their character.”

 Love is Paradise on Earth:

Elder Zosima: “”Gentlemen,” I cried suddenly from the bottom of my heart, “look at the divine gifts around us: the clear sky, the fresh air, the tender grass, the birds, nature is beautiful and sinless, and we, we alone, are godless and foolish, and do not understand that life is paradise, for we need only wish to understand, and it will come at once in all its beauty, and we shall embrace each other and weep”

St Isaac: “Paradise is the love of God, wherein is the enjoyment of all blessedness, and there the blessed Paul partook of supernatural nourishment…Wherefore, the man who lives in love reaps life from God, and while yet in this world, he even now breathes the air of the resurrection; in this air the righteous will delight in the resurrection. Love is the Kingdom, whereof the Lord mystically promised His disciples to eat in His Kingdom. For when we hear Him say, “Ye shall eat and drink at the table of my Kingdom,” what do we suppose we shall eat, if not love? Love is sufficient to nourish a man instead of food and drink.”

Non-literal ‘fire’ of hell:

Elder Zosima: “Fathers and teachers, I ask myself: “What is hell?” And I answer thus: “The suffering of being no longer able to love.”…People speak of the material flames of hell. I do not explore this mystery, and I fear it, but I think that if there were material flames, truly people would be glad to have them, for, as I fancy, in material torment they might forget, at least for a moment, their far more terrible spiritual torment. And yet it is impossible to take this spiritual torment from them, for this torment is not external but is within them”

St Isaac: “As for me I say that those who are tormented in hell are tormented by the invasion of love. What is there more bitter and violent than the pains of love? Those who feel they have sinned against love bear in themselves a damnation much heavier than the most dreaded punishments. The suffering with which sinning against love afflicts the heart is more keenly felt than any other torment. It is absurd to assume that the sinners in hell are deprived of God’s love. Love is offered impartially. But by its very power it acts in two ways. It torments sinners, as happens here on earth when we are tormented by the presence of a friend to whom we have been unfaithful. And it gives joy to those who have been faithful.  That is what the torment of hell is in my opinion: remorse”

source: https://onancientpaths.wordpress.com/2013/07/27/the-elder-zosima-and-st-isaac-the-syrian/

Note for self: Snyk scan from ArgoCD

Pas Apicella - Thu, 2022-06-02 06:30

 Demo

https://github.com/papicella/springbootemployee-api/tree/master/argocd

Demo Job on K8s to perform Snyk IaC Scan

  
apiVersion: batch/v1
kind: Job
metadata:
  name: snyk-iac-scan
  annotations:
    argocd.argoproj.io/hook: PreSync
spec:
  ttlSecondsAfterFinished: 600
  template:
    spec:
      containers:
        - name: snyk-cli
          image: snyk/snyk-cli:npm
          command: ["/bin/sh","-c"]
          args:
            - git clone https://github.com/papicella/springbootemployee-api.git;
              snyk auth $SNYK_TOKEN;
              snyk iac test springbootemployee-api/argocd/employee-K8s.yaml || true;
          env:
            - name: SNYK_TOKEN
              valueFrom:
                secretKeyRef:
                  name: snyk-token
                  key: token
      restartPolicy: Never
  backoffLimit: 0
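
The Job above assumes a Kubernetes secret named snyk-token with a key called token already exists in the same namespace. A minimal sketch of creating it (the namespace and token value are placeholders):

$ kubectl create secret generic snyk-token --from-literal=token=YOUR_SNYK_TOKEN -n YOUR_NAMESPACE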


Categories: Fusion Middleware

Using Pulumi to automate the Snyk Kubernetes integration for containers

Pas Apicella - Mon, 2022-05-30 04:42

Better late than never finally got around to posting this today.

In this blog post, we will walk through the process of using Pulumi, an open source tool that lets developers write infrastructure code in languages like JavaScript, TypeScript, Python, and Go, to create everything required to configure the Kubernetes integration in Snyk Container.

Using Pulumi to automate the Snyk Kubernetes integration for containers
https://snyk.io/blog/automate-snyk-kubernetes-pulumi/


Categories: Fusion Middleware

Intentionality Mind and Nature

Greg Pavlik - Mon, 2022-04-18 15:52
Below I quote at length DB Hart on intentionality and the horizon of rational agency. If you take a moment to think about this point, it really ought to be obvious that it is almost certainly correct. And the fact that we can share a meaning and interest in "correctness" as an end - i.e., we care about that which is true - underscores the point directly. The only reason this may seem abstruse is the unconscious prejudices we carry from living in a late modern culture founded on reductive models of consciousness that are almost certainly irrational - meaning, wrong. 

"Neither doctrine nor metaphysics need be immediately invoked to see the impossibility of rational agency within a sphere of pure nature; a simple phenomenology of what it is we do when we act intentionally should suffice. The rational will, when freely moved, is always purposive; it acts always toward an end: conceived, perceived, imagined, hoped for, resolved upon. Its every act is already, necessarily, an act of recognition, judgment, evaluation, and decision, and is therefore also a tacit or explicit reference to a larger, more transcendent realm of values, meanings, and rational longings. Desire and knowledge are always, in a single impulse, directed to some purpose present to the mind, even if only vaguely. Any act lacking such purposiveness is by definition not an act of rational freedom. There are, moreover, only two possible ways of pursuing a purpose: either as an end in itself or for the sake of an end beyond itself. But no finite object or purpose can wholly attract the rational will in the latter way; no finite thing is desirable simply in itself as an ultimate end. It may, in relative terms, constitute a more compelling end that makes a less compelling end nonetheless instrumentally desirable, but it can never constitute an end in itself. It too requires an end beyond itself to be compelling in any measure; it too can evoke desire only on account of some yet higher, more primordial, more general disposition of reason’s appetites. Even what pleases us most immediately can be intentionally desired only within the context of a rational longing for the Good itself. If not for some always more original orientation toward an always more final end, the will would never act in regard to finite objects at all. Immanent desires are always in a sense deferred toward some more remote, more transcendent purpose. All concretely limited aspirations of the will are sustained within formally limitless aspirations of the will. In the end, then, the only objects of desire that are not reducible to other, more general objects of desire, and that are thus desirable entirely in and of themselves, are those universal, unconditional, and exalted ideals, those transcendentals, that constitute being’s abstract perfections. One may not be, in any given instant, immediately conscious that one’s rational appetites have been excited by these transcendental ends; I am not talking about a psychological state of the empirical ego; but those ends are the constant and pervasive preoccupation of the rational will in the deepest springs of its nature, the source of that “delectable perturbation” that grants us a conceptual grasp of finite things precisely by constantly carrying us restlessly beyond them and thereby denying them even a provisional ultimacy.

In fact, we cannot even possess the barest rational cognizance of the world we inhabit except insofar as we have always already, in our rational intentions, exceeded the world. Intentional recognition is always already interpretation, and interpretation is always already judgment. The intellect is not a passive mirror reflecting a reality that simply composes itself for us within our experience; rather, intellect is itself an agency that converts the storm of sense-intuitions into a comprehensible order through a constant process of interpretation. And it is able to do this by virtue of its always more original, tacit recognition of an object of rational longing—say, Truth itself—that appears nowhere within the natural order, but toward which the mind nevertheless naturally reaches out, as to its only possible place of final rest. All proximate objects are known to us, and so desired or disregarded or rejected, in light of that anticipated finality. Even to seek to know, to organize experience into reflection, is a venture of the reasoning will toward that absolute horizon of intelligibility. And since truly rational desire can never be a purely spontaneous eruption of the will without purpose, it must exhibit its final cause in the transcendental structure of its operation. Rational experience, from the first, is a movement of rapture, of ecstasy toward ends that must be understood as—because they must necessarily be desired as—nothing less than the perfections of being, ultimately convertible with one another in the fullness of reality’s one source and end. Thus the world as something available to our intentionality comes to us in the interval that lies between the mind’s indivisible unity of apprehension and the irreducibly transcendental horizon of its intention—between, that is, the first cause of movement in the mind and the mind’s natural telos, both of which lie outside the composite totality of nature."

DB Hart, You are Gods. University of Notre Dame Press, April 2022

Mother of Mercy

Greg Pavlik - Tue, 2022-03-01 12:56
I think in the western world, art that invokes sacred spaces is, to put it mildly, less than central to the art industry. But in parts of Europe, this is not entirely the case. One region that has seen a kind of generational explosion of sacred art is western Ukraine, an area that is culturally at an intersection between central and western Europe (and where part of my own family is from). To the extent there has been attention paid to these artists, it appears to be largely in Catholic journals - in part I suspect because the region is largely "Greek Catholic" in religion (as again were some of my ancestors), a religion which combines elements of Roman Catholicism and Eastern Orthodoxy: one might say a fusion, or, perhaps a "confusion", depending on one's perspective. But it clearly has generated a creative energy that re-considers traditional concepts of beauty and meaning in art forms that have often been bound to very strict boundaries of received archetypes.

One of those artists is the young Ivanka Demchuk, whose work centers on abrupt symbolism that interleaves openness, geometric constructs, and sparse, but striking, colorations. Shown here is her work Mother of Mercy, a rendering of the Virgin Mary receiving the prayers of her people.


Note the striking use of red within the circle that surrounds the Holy Mother. Here she seems at once fully incorporated into the very cosmos that encompasses and surrounds us. Yet the interior red also invokes dried blood, as if she is interceding for - absorbing, really - the pain of her people. Whether intentional or not, it is certainly evocative; at this time of trauma and fratricidal war in Ukraine, the image seems poignant and terribly relevant. We need more sacred art, not less.

If you read this, please consider donating to humanitarian and non-sectarian relief for Ukraine, especially to help refugees. These posts stay up for years, so if you stumble on this some time after it is posted, the need is still there and any aid is meaningful.

 

Find and fix vulnerabilities in your CI/CD pipeline with Snyk and Harness

Pas Apicella - Wed, 2021-12-22 23:41

Integrating the Snyk developer-focused security platform into Harness’ unified delivery pipeline workflow ensures security and compliance testing is part of every release. This allows you to prevent applications with vulnerable dependencies and code from making their way into production. With modern tooling like Snyk and Harness, you can find, fix, and remediate through a CI/CD pipeline and mitigate the risk to the business without affecting your ability to release software quickly.

I created a new Snyk blog post with the Harness team, as per the link below.

https://snyk.io/blog/find-fix-vulnerabilities-ci-cd-pipeline-snyk-harness/

Categories: Fusion Middleware

Using Harness CI to run a series of Snyk Tests

Pas Apicella - Sun, 2021-10-31 18:13

 When DevOps emerged more than 10 years ago, the main focus was to bridge the gaps between Dev and Ops teams by introducing automation to the processes of building, testing and deploying applications.

As development teams continue to deliver more rapidly and more frequently, security teams are finding it difficult to keep up and often end up being the bottleneck in the delivery pipeline. For this reason, bringing security early into the DevOps process from the outset – in other words, embracing a DevSecOps culture within a business – has become increasingly important.


Applications have changed as cloud-native technologies like containers, Kubernetes, and Infrastructure as Code technologies like Terraform, CloudFormation, and ARM templates are now the norm. These elements are built and customized by developers and live in their source code management repositories.


In the demo below I show you how Harness CI, along with Snyk, can help set up a DevSecOps pipeline well before we even think about deployment.


Steps


Note: We are using the public GitHub repo below


https://github.com/papicella/springbootemployee-api


1. First we need to identify our user token within Snyk. Enterprise customers can set up service accounts and retrieve a token, which we will require later. For non-enterprise Snyk accounts you can just use the main user token. Both methods of obtaining this are described below.


Service Accounts

https://docs.snyk.io/features/integrations/managing-integrations/service-accounts


For those without access to service accounts you can obtain your Snyk user Token as follows

https://docs.snyk.io/features/snyk-cli/install-the-snyk-cli/authenticate-the-cli-with-your-account


2. In Harness, let's define a few secrets: one for the Snyk token we retrieved in step 1, and another for my GitHub token.



3. Our builds need to run somewhere; in this example we are using a connector to our K8s cluster, which is defined at the "Organization" level within Harness.





4. Our first execution step simply packages the application. Our Spring Boot application uses Maven and has a single pom.xml file. Notice below that we have to use a Docker connector for a "Build -> Run" step; here I am using an account-level Docker connector. The Maven command this step runs is excerpted just below.
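
The packaging command for that step, excerpted from the full pipeline YAML at the end of this post (tests are skipped, and -Dsnyk.skip presumably bypasses the Snyk Maven plugin during packaging since the scans run as separate steps):

mvn -DskipTests -Dsnyk.skip package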




5. Now we can run our first Snyk test. In this simple example we don't fail the build at all and ensure we return a SUCCESS exit code for our Snyk Open Source test, which will pick up the one and only pom.xml file in the repo. I am also using a Snyk Docker image that includes the Snyk CLI and Maven to perform this test. The commands for this step are excerpted just below.
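
The run step's commands, again excerpted from the full pipeline YAML at the end of this post - the trailing "|| true" is what guarantees the SUCCESS exit code:

SNYK_TOKEN=<+secrets.getValue("SNYK_TOKEN")>
snyk config set api=$SNYK_TOKEN
snyk test || true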





6. So jumping right ahead let's quickly take a look at our other 3 tests.


Snyk Code Test



Snyk IaC test




Snyk Container Test




In all these Snyk tests we ensure nothing fails; instead we just report on vulnerabilities. We could also set a severity threshold as part of our tests, as shown below.


--severity-threshold=medium


https://docs.snyk.io/features/snyk-cli/test-for-vulnerabilities/set-severity-thresholds-for-cli-tests


7. In order to run the container test we had to build our container image, which was done using a "Build and push image to Dockerhub" step as shown below.



8. Now, after running a few builds, our Overview page in the Harness UI nicely summarises what has passed versus what has failed.



9. Finally we can view each build as follows






Sample Images


Overview Page



Infrastructure Page - Here we define where our Harness delegate runs, in this case a K8s cluster, which is just one of the options for the worker nodes that run the pipeline steps



Execution Page - here we define four separate tests.

  • Snyk Open Source Test
  • Snyk Code Test
  • Snyk IaC Test
  • Snyk Container Test



Pipeline Result Page




For those who wish to see the whole pipeline YAML, here it is below:


pipeline:
  name: employee-api-pipeline
  identifier: employeeapipipeline
  projectIdentifier: Springboot_Employee_API
  orgIdentifier: default
  tags: {}
  properties:
    ci:
      codebase:
        connectorRef: pasgithub
        repoName: springbootemployee-api
        build: <+input>
  stages:
    - stage:
        name: build employee api
        identifier: build_employee_api
        description: ""
        type: CI
        spec:
          cloneCodebase: true
          infrastructure:
            type: KubernetesDirect
            spec:
              connectorRef: org.GKE
              namespace: harness-builds
          execution:
            steps:
              - step:
                  type: Run
                  name: Package Application
                  identifier: Package_Application
                  spec:
                    connectorRef: account.harnessImage
                    image: maven:3.8.3-openjdk-11-slim
                    command: mvn -DskipTests -Dsnyk.skip package
                    privileged: false
              - step:
                  type: Run
                  name: snyk test
                  identifier: snyk_test
                  spec:
                    connectorRef: account.harnessImage
                    image: snyk/snyk-cli:1.745.0-maven-3.5.4
                    command: |-
                      SNYK_TOKEN=<+secrets.getValue("SNYK_TOKEN")>
                      snyk config set api=$SNYK_TOKEN
                      snyk test || true
                    privileged: false
                  failureStrategies: []
              - step:
                  type: Run
                  name: snyk code test
                  identifier: snyk_code_test
                  spec:
                    connectorRef: account.harnessImage
                    image: snyk/snyk-cli:1.745.0-maven-3.5.4
                    command: |-
                      SNYK_TOKEN=<+secrets.getValue("SNYK_TOKEN")>
                      snyk config set api=$SNYK_TOKEN
                      snyk code test || true
                    privileged: false
                  when:
                    stageStatus: Success
                  failureStrategies: []
              - step:
                  type: Run
                  name: snyk IaC test
                  identifier: snyk_IaC_test
                  spec:
                    connectorRef: account.harnessImage
                    image: snyk/snyk-cli:1.745.0-maven-3.5.4
                    command: |-
                      SNYK_TOKEN=<+secrets.getValue("SNYK_TOKEN")>
                      snyk config set api=$SNYK_TOKEN
                      snyk iac test ./employee-K8s.yaml || true
                    privileged: false
                  when:
                    stageStatus: Success
                  failureStrategies: []
              - step:
                  type: BuildAndPushDockerRegistry
                  name: Build Container
                  identifier: Build_Container
                  spec:
                    connectorRef: pasdockerhub
                    repo: pasapples/springbootemployee
                    tags:
                      - harness
                    dockerfile: Dockerfile.harness
                    optimize: true
              - step:
                  type: Run
                  name: snyk container test
                  identifier: snyk_container_test
                  spec:
                    connectorRef: pasdockerhub
                    image: snyk/snyk-cli:1.745.0-maven-3.5.4
                    command: |-
                      SNYK_TOKEN=<+secrets.getValue("SNYK_TOKEN")>
                      snyk config set api=$SNYK_TOKEN
                      snyk container test pasapples/springbootemployee:harness || true
                    privileged: false
                    resources:
                      limits:
                        memory: 2048Mi
        variables: []



More Information


Snyk

https://snyk.io


Harness

https://harness.io/

Categories: Fusion Middleware

Using Buildkite to perform Snyk Open Source and Snyk Code (SAST) tests

Pas Apicella - Tue, 2021-09-14 23:15

Buildkite is a platform for running fast, secure, and scalable continuous integration pipelines on your own infrastructure. In the example below I will run my Buildkite pipeline on my MacBook to perform two Snyk tests: one for open-source dependencies and the other a SAST test of the code itself.

Snyk is an open source security platform designed to help software-driven businesses enhance developer security.

You will need an account on Snyk and Buildkite to follow the steps below.

Steps

1. First, in Snyk, let's create a Service Account whose token I will use to authenticate. You could use your personal Snyk API token, but a service account is all you need to run Snyk tests, so it makes sense to use that.


2. Next, let's store that Service Account token somewhere from which I can safely inject it into my pipeline at the appropriate step. In this example I am using "Google Secret Manager", but there are other choices of course.


Note: We will be using the secret NAME shortly "PAS_BUILDKITE_SA_SNYK_TOKEN"
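
If you are also using Google Secret Manager, creating that secret from a terminal might look roughly like the sketch below (the secret name matches the note above; the token value is a placeholder):

$ echo -n "YOUR_SNYK_SA_TOKEN" | gcloud secrets create PAS_BUILDKITE_SA_SNYK_TOKEN --data-file=-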

3. You will need a Buildkite agent on your local infrastructure; in my case I am using my MacBook, so that's done as follows.

https://buildkite.com/docs/agent/v3/macos 

pasapicella@192-168-1-113:~/demos/integrations/buildkite$ ./start-agent.sh

   _           _ _     _ _    _ _                                _
  | |         (_) |   | | |  (_) |                              | |
  | |__  _   _ _| | __| | | ___| |_ ___    __ _  __ _  ___ _ __ | |_
  | '_ \| | | | | |/ _` | |/ / | __/ _ \  / _` |/ _` |/ _ \ '_ \| __|
  | |_) | |_| | | | (_| |   <| | ||  __/ | (_| | (_| |  __/ | | | |_
  |_.__/ \__,_|_|_|\__,_|_|\_\_|\__\___|  \__,_|\__, |\___|_| |_|\__|
                                                 __/ |
 https://buildkite.com/agent                    |___/

2021-09-15 11:09:33 NOTICE Starting buildkite-agent v3.32.3 with PID: 50130
2021-09-15 11:09:33 NOTICE The agent source code can be found here: https://github.com/buildkite/agent
2021-09-15 11:09:33 NOTICE For questions and support, email us at: hello@buildkite.com
2021-09-15 11:09:33 INFO   Configuration loaded path=/usr/local/etc/buildkite-agent/buildkite-agent.cfg
2021-09-15 11:09:33 INFO   Registering agent with Buildkite...
2021-09-15 11:09:35 INFO   Successfully registered agent "y.y.y.y.tpgi.com.au-1" with tags []
2021-09-15 11:09:35 INFO   Starting 1 Agent(s)
2021-09-15 11:09:35 INFO   You can press Ctrl-C to stop the agents

4. You're now ready to create a pipeline. A pipeline is a template of the steps you want to run. There are many types of steps: some run scripts, some define conditional logic, and others wait for user input. When you run a pipeline, a build is created. Each of the steps in the pipeline ends up as a job in the build, and the jobs are then distributed to available agents.

In the example below our pipeline is created from a GitHub repo, and then we select the default branch. From that point, incoming webhooks are sent to Buildkite by source control providers (GitHub, GitLab, Bitbucket, etc.) to trigger builds; in this scenario we are using GitHub.


5. Let's go ahead and edit the build steps using YAML. My final YAML is as follows, and I explain below why it looks this way; in short, I just want to run two Snyk tests rather than actually deploy anything for this demo.

steps:
  - commands:
      - "snyk config set api=$$SNYK_SA_TOKEN_VAR"
      - "snyk test --severity-threshold=$$SEVERITY_THRESHOLD"
      - "snyk code test --org=$$SNYK_ORG"
    plugins:
      - avaly/gcp-secret-manager#v1.0.0:
          credentials_file: /Users/pasapicella/snyk/clouds/gcp/buildkite-secrets-gcp.json
          env:
            SNYK_SA_TOKEN_VAR: PAS_BUILDKITE_SA_SNYK_TOKEN
    env:
      SEVERITY_THRESHOLD: "critical"
      SNYK_ORG: "pas.apicella-41p"
    label: "Employee API Snyk Test"

Few things to note here:

  • I am using a GCP secret manager plugin to retrieve my Snyk SA token with a name as follows "PAS_BUILDKITE_SA_SNYK_TOKEN"
  • I am using a Google Service Account JSON so I can authenticate with GCP and retrieve my secret "SNYK_SA_TOKEN_VAR", you will need to use a Service Account with privileges to at least READ from Google Secret Manager
  • I am using some local non sensitive ENV variables which get used at the appropriate time
  • I have three commands of which the first command sets my Snyk API token for the Snyk CLI
  • I have not installed the Snyk CLI because it already exists on my Macbook 
  • I am only looking for my Snyk tests to fail if they find any CRITICAL issues
  • I should be running a "mvn package" here, but I can still execute a "snyk test" without it for demo purposes as we have a pom.xml
  • I could also build a container in the pipeline from the source code and then run a "snyk container test" as well; in fact I could even run "snyk iac test" against any IaC files in the repo
  • If a test fails we can easily run "snyk monitor" to load the results into the Snyk App, but for this demo we don't do that (see the sketch just after this list)
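
As a rough sketch only, those extra tests and the monitor call could be appended to the commands list of the pipeline YAML along these lines (the image name and IaC file path are hypothetical placeholders):

      - "snyk iac test ./k8s/deployment.yaml || true"
      - "snyk container test your-dockerhub-user/employee-api:latest || true"
      - "snyk monitor"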

6. Now we can manually run a build or wait for a triggering event on our repo. Here are some screenshots of what it looks like, including some failures where we find vulnerabilities in a separate Node.js repo.





 

It makes more sense to create a Buildkite plugin for Snyk rather than execute commands using a script, and an example of one is linked below. Having said that, the commands you run to execute a "snyk test" are simple enough to include in the pipeline YAML without the need for a plugin, especially if you already have infrastructure set up with the ability to run the Snyk CLI. A plugin would still be the right approach here, though, as per the example below.

https://github.com/seek-oss/snyk-buildkite-plugin


Hopefully you have seen how easy it is to continuously avoid known vulnerabilities in your dependencies and code, by integrating Snyk into your continuous integration pipeline with Buildkite.

More Information
Snyk
Categories: Fusion Middleware

Using the Elastic Snyk module to visualize Snyk imported project data with Elastic Kibana

Pas Apicella - Mon, 2021-08-30 00:47

Snyk is an open source security platform designed to help software-driven businesses enhance developer security

The Elastic (ELK) Stack is a distributed, RESTful search and analytics engine capable of addressing a growing number of use cases.

In this blog we will use Elastic to visualize our vulnerability data from Snyk using the Elastic Snyk module.

This module is used for ingesting data from the different Snyk API endpoints. It currently supports these filesets:
  • vulnerabilities fileset: Collects all found vulnerabilities for the related organizations and projects
  • audit fileset: Collects audit logging from Snyk; this can be actions like users, permissions, groups, API access and more.
When you run the module, it performs a few tasks under the hood:
  • Sets the default paths to the log files (but don’t worry, you can override the defaults)
  • Makes sure each multiline log event gets sent as a single event
  • Uses ingest node to parse and process the log lines, shaping the data into a structure suitable for visualizing in Kibana

Here is how to get started with this currently BETA module

Steps 

1. First you will need an account on Snyk App and have imported a few projects so you have some vulnerability data to get started with. Here is an example of some imported projects which already exist in my Snyk App Account.


2. You will need an Elastic cluster; the best way to get one if you don't have one is to head to the Elastic Cloud Service for a free trial.

3. Next we need to install Elastic Filebeat. The quick start guide here is the best way to do that

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-installation-configuration.html

Couple of things here

  • You need to run this command to enable the snyk module as shown below. In this example the module is already enabled so the output would differ the first time you run this.

$ ./filebeat modules enable snyk
Module snyk is already enabled

  • Make sure you have configured connectivity to your Elastic Cluster as per the doc above. The example below is for the Elastic Cloud Service itself.

cloud.id: "staging:dXMtZWFzdC0xLmF3cy5mb3VuZC5pbyRjZWM2ZjI2MWE3NGJmMjRjZTMzYmI4ODExYjg0Mjk0ZiRjNmMyY2E2ZDA0MjI0OWFmMGNjN2Q3YTllOTYyNTc0Mw=="
cloud.auth: "filebeat_setup:YOUR_PASSWORD"

4. With Elastic Filebeat installed and configured now we can setup the Snyk module to do that we perform the following.

Edit "./modules.d/snyk.yml

The following link shows how to configure the snyk.yml file and what settings are available

https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-module-snyk.html

Example Config to get started with: 

- module: snyk
  audit:
    enabled: true
    var.input: httpjson
    var.audit_type: organization
    var.audit_id: SNYK_ORG_ID
    var.interval: 1h
    var.api_token: SNYK_API_TOKEN
    var.first_interval: 30d

  vulnerabilities:
    enabled: true
    var.interval: 1h
    var.first_interval: 30d
    var.api_token: SNYK_API_TOKEN
    var.orgs:
      - SNYK_ORG_ID_1
      - SNYK_ORG_ID_2

Note: In this example we are obtaining data from 2 organizations

You obtain your organization IDs and Snyk API token from Snyk App as shown below

Account Settings -> General



Settings -> General -> Organization ID (for each organization you wish to use)
 


5. At this point we can start Elastic Filebeat as shown below.

$ ./filebeat -e
...
2021-08-30T14:23:48.034+1000 INFO [esclientleg] eslegclient/connection.go:273 Attempting to connect to Elasticsearch version 7.14.0
2021-08-30T14:23:48.163+1000 INFO cfgfile/reload.go:224 Loading of config files completed.
2021-08-30T14:23:48.163+1000 INFO [input.httpjson-cursor] compat/compat.go:111 Input httpjson-cursor starting {"id": "BC01B4DEC1514B32"}
2021-08-30T14:23:48.163+1000 INFO [input.httpjson-cursor] compat/compat.go:111 Input httpjson-cursor starting {"id": "303DFE9AECEEEF55"}
2021-08-30T14:23:48.164+1000 INFO [input.httpjson-cursor] v2/input.go:112 Process another repeated request. {"id": "303DFE9AECEEEF55", "input_source": "https://snyk.io/api/v1/reporting/issues/?page=1&perPage=10&sortBy=issueTitle&order=asc&groupBy=issue", "input_url": "https://snyk.io/api/v1/reporting/issues/?page=1&perPage=10&sortBy=issueTitle&order=asc&groupBy=issue"}
2021-08-30T14:23:48.164+1000 INFO [input.httpjson-cursor] v2/input.go:112 Process another repeated request. {"id": "BC01B4DEC1514B32", "input_source": "https://snyk.io/api/v1/org/yyyy/audit?page=1&sortOrder=ASC", "input_url": "https://snyk.io/api/v1/org/yyyy/audit?page=1&sortOrder=ASC"}
2021-08-30T14:23:49.656+1000 INFO [input.httpjson-cursor] v2/request.go:210 request finished: 0 events published {"id": "BC01B4DEC1514B32", "input_source": "https://snyk.io/api/v1/org/yyyy/audit?page=1&sortOrder=ASC", "input_url": "https://snyk.io/api/v1/org/yyyy/audit?page=1&sortOrder=ASC"}

...

6. If we head to Kibana and go to the "Discover" page, we will see data flowing into the cluster by filtering on "event.module = snyk" against the "filebeat-*" index pattern, as shown below.

Or maybe we want to use Dev Tools itself, as sketched below.
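
A minimal sketch of a Dev Tools query that returns a handful of the ingested Snyk events, assuming the default filebeat-* index pattern used above:

GET filebeat-*/_search
{
  "query": { "term": { "event.module": "snyk" } },
  "size": 5
}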



7. Finally a Dashboard of some graphs as we monitor the "Snyk Data" coming into the cluster 


Next Steps?

Kibana gives you the freedom to select the way you give shape to your data. With its interactive visualizations, start with one question and see where it leads you. If it were me, I would go straight to Kibana Lens. Kibana Lens is an easy-to-use, intuitive UI that simplifies the process of data visualization through a drag-and-drop experience; it allowed me to create the dashboard page above in under 3 minutes, even without knowing where to start.

https://www.elastic.co/kibana/kibana-lens

Want all your vulnerability data from the Snyk Platform within your Elastic Stack? The "Elastic Snyk Module" will do that for you in a few easy steps.

More Information
Snyk

Elastic Snyk Module
Categories: Fusion Middleware

Snyk Container meets Cloud Native Buildpacks: Cloud Native Application Security the right way

Pas Apicella - Thu, 2021-08-26 17:25

Released my first Snyk blog post on how Snyk Container, along with Cloud Native Buildpacks, can mitigate the security risks of containerized applications.

https://snyk.io/blog/snyk-container-cloud-native-buildpacks-cloud-native-application-security/



Categories: Fusion Middleware

Taking Snyk Code for Test Drive

Pas Apicella - Thu, 2021-08-19 20:45

Snyk Code is our newest addition to the Snyk platform. Snyk Code uses a revolutionary approach designed to be developer-first. Conventional Static Application Security Testing (SAST) tools are limited by lengthy scan times and poor accuracy, returning too many false positives and eroding developer trust. Snyk Code makes developer efforts efficient and actionable.

In this short blog post let's take it for a test drive on a large Java Project.

Steps

1. You will need an account on Snyk App; sign up for free at snyk.io

2. Once you have an account and are logged in go ahead and enable "Snyk Code" as follows


3. Clone the following GitHub repo as follows.

https://github.com/papicella/CWE-Juliet-TestSuite-Java

$ git clone https://github.com/papicella/CWE-Juliet-TestSuite-Java
Cloning into 'CWE-Juliet-TestSuite-Java'...
remote: Enumerating objects: 12964, done.
remote: Counting objects: 100% (12964/12964), done.
remote: Compressing objects: 100% (969/969), done.
remote: Total 12964 (delta 11931), reused 12964 (delta 11931), pack-reused 0
Receiving objects: 100% (12964/12964), 6.44 MiB | 6.04 MiB/s, done.
Resolving deltas: 100% (11931/11931), done.
Updating files: 100% (13095/13095), done.

4. Authenticate with "Snyk CLI" as shown below

Note: The following link will get the CLI installed for you

https://support.snyk.io/hc/en-us/articles/360003812538-Install-the-Snyk-CLI

$ snyk auth

5. Let's check the number of lines of code we are scanning, as follows. Here we are making sure we only scan Java source files, as that's all that exists in this repo in any case. Snyk Code supports other programming languages as well, like Python, JavaScript, etc.

$ cd CWE-Juliet-TestSuite-Java
$ find ./Java/src -name "*.java" -type f -exec wc -l {} \; | awk '{total += $1} END{print total}'
2,479,301

So we have close to 2.5 million lines of code; of course, that count includes empty lines such as carriage returns, but that's still a lot of code to scan.

6. Run your "Snyk Code" Test as follows to see two things , how long the scan took and the number of results returned. From this we can see the scan and results took less then 2 minutes!

$ time snyk code test ./Java/src

....

 ✗ [High] SQL Injection
     Path: testcases/CWE89_SQL_Injection/s02/CWE89_SQL_Injection__getCookies_Servlet_execute_41.java, line 42
     Info: Unsanitized input from cookies flows into execute, where it is used in an SQL query. This may result in an SQL Injection vulnerability.

 ✗ [High] SQL Injection
     Path: testcases/CWE89_SQL_Injection/s02/CWE89_SQL_Injection__getCookies_Servlet_execute_14.java, line 62
     Info: Unsanitized input from cookies flows into execute, where it is used in an SQL query. This may result in an SQL Injection vulnerability.

 ✗ [High] SQL Injection
     Path: testcases/CWE89_SQL_Injection/s02/CWE89_SQL_Injection__getCookies_Servlet_execute_10.java, line 62
     Info: Unsanitized input from cookies flows into execute, where it is used in an SQL query. This may result in an SQL Injection vulnerability.


✔ Test completed

Organization:      undefined
Test type:         Static code analysis
Project path:      ./Java/src

3086 Code issues found
1491 [High]  1595 [Medium]


real 1m4.269s
user 0m45.630s
sys 0m7.182s
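
Given the summary above shows 1491 High severity issues, a follow-up run that reports only the High severity findings might look roughly like this, using the CLI's standard severity flag:

$ snyk code test ./Java/src --severity-threshold=high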

The following post shows some comparison data against other SAST engines and code repositories.

https://snyk.io/blog/sast-tools-speed-comparison-snyk-code-sonarqube-lgtm/

More Information
Snyk Code

Snyk Code is now available for free
Categories: Fusion Middleware

Snyk provides native integration for Atlassian Bitbucket Cloud security - Here is how!!!

Pas Apicella - Thu, 2021-07-22 23:56

The Snyk security integration is free and easy to set up with just a few clicks inside the Bitbucket Cloud product. For the first time, developers can consume information that was previously only available inside Snyk now within Bitbucket Cloud. Snyk enables developers to see new vulnerabilities as they emerge and implement fixes early and quickly in the process. 

In this post we show how easily you can integrate Snyk into Bitbucket Cloud

Steps
Note: For the following to work you have to integrate Bitbucket Cloud with the Snyk App as per this link 


1. Once logged into Bitbucket Cloud, navigate to your "Workspace Settings" and authenticate with Snyk as shown below.





2. Next select the repository you wish to use as shown below.





3. You should see a "Snyk" tab on the left-hand side; click on this and then click on "Import this repository" as shown below.




4. In a couple of minutes or less you should see a "Snyk" overview report as per below. 




5. Click on "pom.xml" to get more information as shown below. Here we get a list of all Vulnerabilities as per a scan of the package manifest file "pom.xml" in this example.





6. On the top of this page you can directly go to the project page on "Snyk App" by clicking on "Visit Snyk" as shown below.




It's as simple as that!
 
More Information
Demo Video

Categories: Fusion Middleware

Snyk Import Project API using Azure Repos

Pas Apicella - Thu, 2021-07-01 20:43

In the example below we show how you can import an Azure Repo directly using the Snyk API. The Snyk API is available to customers on paid plans and allows you to programmatically integrate with Snyk.

API url

The base URL for all API endpoints is https://snyk.io/api/v1/

Authorization

To use this API, you must get your token from Snyk. It can be seen on https://snyk.io/account/ after you register with Snyk and login.

The token should be supplied in an Authorization header, preceded by token:

Authorization: token API_KEY

Steps

Note: You would have configured an Azure Repos Integration as shown below prior to running these steps




1. Authenticate with the CLI using your Snyk Token as follows

$ snyk auth TOKEN

Your account has been authenticated. Snyk is now ready to be used.

2. Log into your Azure DevOps account and verify which Project Repo you wish to import as per the image below


3. With your project selected you will need the following details to perform the Snyk API import request

owner: Name of your project


name: Name of your Repository


branch: Branch name you wish to import



4. Finally we will need our Organisation ID and Azure Repos ID which we retrieve from the Snyk UI as follows

organization ID

Settings -> General -> Organization ID

Azure Repos ID

Settings -> Integrations -> Azure Repos -> Edit Settings -> Scroll to bottom of page



Take note of both IDs; we will need them in the steps below

5. Now we are ready to import our Azure Repo using the command below. We are using "curl" to issue a POST request; this command is for a macOS or Linux system, so it would need tweaking if running on Windows, for example.

Command

curl --include \
     --request POST \
     --header "Content-Type: application/json; charset=utf-8" \
     --header "Authorization: token `snyk config get api`" \
     --data-binary "{
  \"target\": {
    \"owner\": \"spring-book-service\",
    \"name\": \"spring-book-service\",
    \"branch\": \"master\"
  }
}" \
"https://snyk.io/api/v1/org/${ORG_ID}/integrations/${AZURE_REPO_ID}/import"

6. Set the ENV variables for ORG_ID and AZURE_REPO_ID as follows

export ORG_ID=FROM_STEP-4
export AZURE_REPO_ID=FROM_STEP-4

7. Run command

$ curl --include \
>      --request POST \
>      --header "Content-Type: application/json; charset=utf-8" \
>      --header "Authorization: token `snyk config get api`" \
>      --data-binary "{
>   \"target\": {
>     \"owner\": \"spring-book-service\",
>     \"name\": \"spring-book-service\",
>     \"branch\": \"master\"
>   }
> }" \
> "https://snyk.io/api/v1/org/${ORG_ID}/integrations/${AZURE_REPO_ID}/import"
HTTP/2 201
content-security-policy: base-uri 'none'; script-src 'self' https: 'nonce-ENlk6rSQsdLgbcWNcCaA7A==' 'unsafe-inline' 'unsafe-eval' 'strict-dynamic' 'report-sample'; img-src https: data:; object-src 'none'; report-to csp-report-group; report-uri https://web-reports.snyk.io/csp?version=39e8721a74a6dffcb97fc790cfbf1fca91cefc03;
report-to: {"group":"csp-report-group","max_age":1800,"endpoints":[{"url":"https://web-reports.snyk.io/csp?version=39e8721a74a6dffcb97fc790cfbf1fca91cefc03"}],"include_subdomains":true}
x-snyk-version: undefined
snyk-request-id: 1669e85e-abe5-401b-80bb-dae41829d6e1
location: https://snyk.io/api/v1/org/.....
content-type: application/json; charset=utf-8
content-length: 2
etag: W/"2-vyGp6PvFo4RvsFtPoIWeCReyIC8"
date: Fri, 02 Jul 2021 01:39:45 GMT
x-frame-options: deny
x-content-type-options: nosniff
x-xss-protection: 1; mode=block
strict-transport-security: max-age=31536000; preload

{}

Finally switch back to the Snyk UI and verify you have imported the Azure repo as shown below


More Information
Import Projects API


Categories: Fusion Middleware

Basic Pipeline using Snyk Container, OCI Images, Azure DevOps all part of Cloud Native Application Security

Pas Apicella - Mon, 2021-06-14 00:20

Snyk Container will find vulnerabilities in containers and Kubernetes workloads throughout the SDLC by scanning any compliant OCI image, including those created by Cloud Native Buildpacks or other build tools that produce OCI images.

So what could an Azure DevOps Pipeline look like that incorporates the following using Snyk?

Running a Snyk Scan against the project repository
Here we run a "snyk test" from the root folder of the repository itself and that report is then



Building your Artifact
Here we use a Maven task which packages the application Artifact as a JAR file ready to run




Creating an OCI compliant container image from the Artifact itself
There are various ways to create an OCI-compliant image, but by far the simplest is using Cloud Native Buildpacks. For this we use the pack CLI, which in turn uses the Java Buildpack against our JAR file directly, avoiding a compile from source given we already did that in the step above.



Running a Snyk Scan against the container image directly on the Container Registry
With our container image now in our Container Registry, we can use "snyk container" to check for issues directly from the registry, and also check for application security issues in the open source dependencies as well. The equivalent CLI invocation is sketched below.
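
For reference, a roughly equivalent check from a terminal with the Snyk CLI would look like the sketch below (the image name and the --app-vulns flag are taken from the pipeline YAML at the end of this post):

$ snyk container test pasapples/springbootemployee:cnb-paketo-base --app-vulns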





The finished Pipeline ...







azure-pipeline.yml Pipeline used in Azure DevOps

# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:

- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    mavenOptions: '-Xmx3072m'
    javaHomeOption: 'JDKVersion'
    jdkVersionOption: '1.11'
    jdkArchitectureOption: 'x64'
    publishJUnitResults: true
    options: "-Dskiptests -Dsnyk.skip"
    goals: 'package'
  displayName: "Build artifact JAR"

- task: SnykSecurityScan@0
  inputs:
    serviceConnectionEndpoint: 'snyk-token'
    testType: 'app'
    monitorOnBuild: false
    failOnIssues: false
  displayName: "snyk test from source"

- task: Docker@2
  inputs:
    containerRegistry: 'docker-pasapples-connection'
    command: 'login'
  displayName: "Login to DockerHub"

- script: |
    curl -sSL "https://github.com/buildpacks/pack/releases/download/v0.19.0/pack-v0.19.0-linux.tgz" | tar -C ./ --no-same-owner -xzv pack
    ./pack build pasapples/springbootemployee:cnb-paketo-base --builder paketobuildpacks/builder:base --publish --path ./target/springbootemployee-0.0.1-SNAPSHOT.jar
  displayName: 'Build Container with Pack'

- task: SnykSecurityScan@0
  inputs:
    serviceConnectionEndpoint: 'snyk-token'
    testType: 'container'
    dockerImageName: 'pasapples/springbootemployee:cnb-paketo-base'
    severityThreshold: 'low'
    monitorOnBuild: false
    failOnIssues: false
    additionalArguments: "--app-vulns"
  displayName: "snyk container scan from image"


More Information

So, for Container and Kubernetes security, designed to help developers find and fix vulnerabilities in cloud native applications, click the links below to learn more and get started today.

Snyk Container

Snyk Platform

Categories: Fusion Middleware

Installing Snyk Controller into a k3d kubernetes cluster to enable runtime container scanning with the Snyk Platform

Pas Apicella - Thu, 2021-06-03 00:16

Snyk integrates with Kubernetes, enabling you to import and test your running workloads and identify vulnerabilities in their associated images and configurations that might make those workloads less secure. Once imported, Snyk continues to monitor those workloads, identifying additional security issues as new images are deployed and the workload configuration changes

In the example below we show you how easy it is to integrate the Snyk Platform with any K8s distribution in this case k3d running on my laptop.

Steps 

1. Install k3d using the instructions from the link below.

https://k3d.io/

2. Create a cluster as shown below.

pasapicella@192-168-1-113:~/snyk/demos/kubernetes/k3d$ k3d cluster create snyk-k3d --servers 1 --agents 2
INFO[0000] Prep: Network
INFO[0003] Created network 'k3d-snyk-k3d'
INFO[0003] Created volume 'k3d-snyk-k3d-images'
INFO[0004] Creating node 'k3d-snyk-k3d-server-0'
INFO[0005] Creating node 'k3d-snyk-k3d-agent-0'
INFO[0005] Creating node 'k3d-snyk-k3d-agent-1'
INFO[0005] Creating LoadBalancer 'k3d-snyk-k3d-serverlb'
INFO[0005] Starting cluster 'snyk-k3d'
INFO[0005] Starting servers...
INFO[0005] Starting Node 'k3d-snyk-k3d-server-0'
INFO[0012] Starting agents...
INFO[0012] Starting Node 'k3d-snyk-k3d-agent-0'
INFO[0023] Starting Node 'k3d-snyk-k3d-agent-1'
INFO[0031] Starting helpers...
INFO[0031] Starting Node 'k3d-snyk-k3d-serverlb'
INFO[0033] (Optional) Trying to get IP of the docker host and inject it into the cluster as 'host.k3d.internal' for easy access
INFO[0036] Successfully added host record to /etc/hosts in 4/4 nodes and to the CoreDNS ConfigMap
INFO[0036] Cluster 'snyk-k3d' created successfully!
INFO[0036] --kubeconfig-update-default=false --> sets --kubeconfig-switch-context=false
INFO[0036] You can now use it like this:
kubectl config use-context k3d-snyk-k3d
kubectl cluster-info

3. View the Kubernetes nodes.

$ kubectl get nodes
NAME                    STATUS   ROLES                  AGE   VERSION
k3d-snyk-k3d-server-0   Ready    control-plane,master   21h   v1.20.5+k3s1
k3d-snyk-k3d-agent-0    Ready    <none>                 21h   v1.20.5+k3s1
k3d-snyk-k3d-agent-1    Ready    <none>                 21h   v1.20.5+k3s1

4.  Run the following command in order to add the Snyk Charts repository to Helm.

$ helm repo add snyk-charts https://snyk.github.io/kubernetes-monitor/
"snyk-charts" already exists with the same configuration, skipping

5. Once the repository is added, create a unique namespace for the Snyk controller:

$ kubectl create namespace snyk-monitor

6. Now, log in to your Snyk account and navigate to Integrations. Search for and click Kubernetes. Click Connect on the page that loads and copy the Integration ID. The Snyk Integration ID is a UUID, similar to this format: abcd1234-abcd-1234-abcd-1234abcd1234. Save it for use from your Kubernetes environment in the next step.

Instructions link : https://support.snyk.io/hc/en-us/articles/360006368657-Viewing-your-Kubernetes-integration-settings

7. The Snyk monitor runs using your Snyk Integration ID and a dockercfg file. Since we are not using any private registries in this demo, create a Kubernetes secret called snyk-monitor containing the Snyk Integration ID from the previous step by running the following command:

$ kubectl create secret generic snyk-monitor -n snyk-monitor \
         --from-literal=dockercfg.json={} \
         --from-literal=integrationId=INTEGRATION_TOKEN_FROM_STEP_6
secret/snyk-monitor created

8. Install the Snyk Helm chart as follows:

$ helm upgrade --install snyk-monitor snyk-charts/snyk-monitor \
                          --namespace snyk-monitor \
                          --set clusterName="k3d Dev cluster"
Release "snyk-monitor" does not exist. Installing it now.
NAME: snyk-monitor
LAST DEPLOYED: Wed Jun  2 17:47:13 2021
NAMESPACE: snyk-monitor
STATUS: deployed
REVISION: 1
TEST SUITE: None

9. Verify the Snyk controller is running by checking its Pod:

$ kubectl get pods -n snyk-monitor
NAME                           READY   STATUS    RESTARTS   AGE
snyk-monitor-64c94685b-fwpvx   1/1     Running   3          21h
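
You can also check the Helm release itself (assuming the release name snyk-monitor used in the previous step):

$ helm status snyk-monitor -n snyk-monitor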

10. At this point we can create some workloads. Let's just add a single Pod to the cluster for a basic Spring Boot application.

$ kubectl run springboot-app --image=pasapples/spring-boot-jib --port=8080
pod/springboot-app created
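
Before importing it, it's worth confirming the Pod actually reaches the Ready state (kubectl wait is a standard command; the timeout value here is arbitrary):

$ kubectl wait --for=condition=Ready pod/springboot-app --timeout=120s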

11. Head back to the Snyk Dashboard and click on your Kubernetes integration tile. You should see a list of applicable workloads to monitor; in our case we just have the single app called "springboot-app".



12. Add the selected workload and you're done!



More Information

Install the Snyk controller with Helm

Categories: Fusion Middleware

Cloud Native Buildpacks meets Snyk Container

Pas Apicella - Thu, 2021-05-27 07:35

Cloud Native Buildpacks transform your application source code into images that can run on any cloud, and you avoid ever having to write a Dockerfile again. After all, why would you even care about a Dockerfile?

So in today's post we are going to take a look at how Snyk App, together with Snyk Container, can scan those Cloud Native Buildpacks OCI images for you.

Snyk Container equips developers to quickly fix container issues. Use the following to find out more about Snyk Container - https://snyk.io/product/container-vulnerability-management/

You may not always have access to the original source code that runs in your containers, but vulnerabilities in your code dependencies are still important. Snyk can detect and monitor open source dependencies for popular languages as part of the container scan, which matters here because Cloud Native Buildpacks build container images directly from source code written in those languages.

Demo

1. First, let's clone the Spring Petclinic app and create our artifact for deployment.

$ git clone https://github.com/spring-projects/spring-petclinic
$ cd spring-petclinic
$ ./mvnw package

2. Install pack so we can build OCI images directly into our registries from a simple CLI. Installation instructions are available at buildpacks.io; one example is shown below.
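
For example, on macOS with Homebrew (again an assumption about your setup; the Buildpacks docs cover other platforms):

$ brew install buildpacks/tap/pack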


3. Test that pack is installed by printing its version and listing the suggested builders.

$ pack --version
0.18.1+git-b5c1a96.build-2373

$ pack builder suggest
Suggested builders:
Google:                       gcr.io/buildpacks/builder:v1      
Heroku:                       heroku/buildpacks:18              
Heroku:                       heroku/buildpacks:20      
Paketo Buildpacks:     paketobuildpacks/builder:base    
Paketo Buildpacks:     paketobuildpacks/builder:full    
Paketo Buildpacks:     paketobuildpacks/builder:tiny

At this point we are ready to go, so let's create 3 OCI images and push them to 3 separate registries as follows:

  • Dockerhub
  • Amazon Elastic Container Registry (ECR)
  • Google Container Registry (GCR)

Note: Your local Docker Desktop needs to be able to establish (authenticated) connections to the 3 registries above; a sketch of what that typically involves is shown below.
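
For reference, registry credentials are usually set up roughly like this (the ECR account and region below are simply the ones used in the examples that follow; adjust everything for your own accounts):

$ docker login
$ aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 300326902600.dkr.ecr.us-east-1.amazonaws.com
$ gcloud auth configure-docker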

4. Build an OCI image of petclinic to Dockerhub.

$ pack build pasapples/petclinic:latest --builder paketobuildpacks/builder:base --publish --path ./spring-petclinic/target/spring-petclinic-2.4.5.jar
base: Pulling from paketobuildpacks/builder
Digest: sha256:a6f81cb029d4d3272981c12dad7212a5063ec0076e2438b7b5bb702f2e1fd11a
Status: Image is up to date for paketobuildpacks/builder:base
===> DETECTING
5 of 18 buildpacks participating
paketo-buildpacks/ca-certificates   2.2.0
paketo-buildpacks/bellsoft-liberica 8.0.0

...

Adding label 'org.opencontainers.image.version'
Adding label 'org.springframework.boot.version'
Setting default process type 'web'
Saving pasapples/petclinic:latest...
*** Images (sha256:d730612833826cd9e39a7241c1fba411eacd9b5f771915b00af4b3b499838bd5):
      pasapples/petclinic:latest
Successfully built image pasapples/petclinic:latest


5. Build an OCI image of petclinic to ECR.

$ pack build 300326902600.dkr.ecr.us-east-1.amazonaws.com/petclinic:latest --builder paketobuildpacks/builder:base --publish --path ./spring-petclinic/target/spring-petclinic-2.4.5.jar
base: Pulling from paketobuildpacks/builder
Digest: sha256:a6f81cb029d4d3272981c12dad7212a5063ec0076e2438b7b5bb702f2e1fd11a
Status: Image is up to date for paketobuildpacks/builder:base
===> DETECTING
5 of 18 buildpacks participating
paketo-buildpacks/ca-certificates   2.2.0
paketo-buildpacks/bellsoft-liberica 8.0.0

...

Adding label 'org.opencontainers.image.version'
Adding label 'org.springframework.boot.version'
Setting default process type 'web'
Saving 300326902600.dkr.ecr.us-east-1.amazonaws.com/petclinic:latest...
*** Images (sha256:d730612833826cd9e39a7241c1fba411eacd9b5f771915b00af4b3b499838bd5):
      300326902600.dkr.ecr.us-east-1.amazonaws.com/petclinic:latest
Successfully built image 300326902600.dkr.ecr.us-east-1.amazonaws.com/petclinic:latest


6. Build an OCI image of petclinic to GCR.

$ pack build us.gcr.io/snyk-cx-se-demo/petclinic-google:latest --builder paketobuildpacks/builder:base --publish --path ./spring-petclinic/target/spring-petclinic-2.4.5.jar
base: Pulling from paketobuildpacks/builder
Digest: sha256:a6f81cb029d4d3272981c12dad7212a5063ec0076e2438b7b5bb702f2e1fd11a
Status: Image is up to date for paketobuildpacks/builder:base
===> DETECTING
5 of 18 buildpacks participating
paketo-buildpacks/ca-certificates   2.2.0
paketo-buildpacks/bellsoft-liberica 8.0.0

...

Adding label 'org.opencontainers.image.version'
Adding label 'org.springframework.boot.version'
Setting default process type 'web'
Saving us.gcr.io/snyk-cx-se-demo/petclinic-google:latest...
*** Images (sha256:d730612833826cd9e39a7241c1fba411eacd9b5f771915b00af4b3b499838bd5):
      us.gcr.io/snyk-cx-se-demo/petclinic-google:latest
Successfully built image us.gcr.io/snyk-cx-se-demo/petclinic-google:latest


Three OCI-compliant images of petclinic in our 3 registries, done!!!

So now it's over to Snyk.

7. Log in to Snyk App - https://app.snyk.io

8. Let's do a scan from the CLI by first authenticating with Snyk App as follows. This will direct you to a browser to authenticate; once done, return to the prompt.

$ snyk auth 

9. Now you can run a snyk container test against any of the 3 registries we used, as follows.

$ snyk container test us.gcr.io/snyk-cx-se-demo/petclinic-google:latest

$ snyk container test 300326902600.dkr.ecr.us-east-1.amazonaws.com/petclinic:latest

$ snyk container test pasapples/petclinic:latest
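
Optionally, you can narrow the report to more serious issues with the standard --severity-threshold flag (a hedged aside, not something the demo below relies on):

$ snyk container test pasapples/petclinic:latest --severity-threshold=high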

Demo Output

Testing us.gcr.io/snyk-cx-se-demo/petclinic-google:latest...

✗ Low severity vulnerability found in shadow/passwd
  Description: Time-of-check Time-of-use (TOCTOU)
  Info: https://snyk.io/vuln/SNYK-UBUNTU1804-SHADOW-306209
  Introduced through: shadow/passwd@1:4.5-1ubuntu2, apt@1.6.13, shadow/login@1:4.5-1ubuntu2
  From: shadow/passwd@1:4.5-1ubuntu2
  From: apt@1.6.13 > adduser@3.116ubuntu1 > shadow/passwd@1:4.5-1ubuntu2
  From: shadow/login@1:4.5-1ubuntu2

...

✗ Medium severity vulnerability found in gcc-8/libstdc++6
  Description: Information Exposure
  Info: https://snyk.io/vuln/SNYK-UBUNTU1804-GCC8-572149
  Introduced through: gcc-8/libstdc++6@8.4.0-1ubuntu1~18.04, apt/libapt-pkg5.0@1.6.13, apt@1.6.13, meta-common-packages@meta
  From: gcc-8/libstdc++6@8.4.0-1ubuntu1~18.04
  From: apt/libapt-pkg5.0@1.6.13 > gcc-8/libstdc++6@8.4.0-1ubuntu1~18.04
  From: apt@1.6.13 > gcc-8/libstdc++6@8.4.0-1ubuntu1~18.04
  and 2 more...

Organization:      pas.apicella-41p
Package manager:   deb
Project name:      docker-image|us.gcr.io/snyk-cx-se-demo/petclinic-google
Docker image:      us.gcr.io/snyk-cx-se-demo/petclinic-google:latest
Platform:          linux/amd64
Licenses:          enabled

Tested 97 dependencies for known issues, found 25 issues.

10. Back in Snyk App, we can import all 3 OCI images from all 3 registries once we configure an integration for each registry, as shown below.
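
Alternatively, if you prefer to stay on the CLI, snyk container monitor will push a snapshot of an image to Snyk App for ongoing monitoring (an optional extra, not required for the registry-import flow above):

$ snyk container monitor pasapples/petclinic:latest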





11. Then take a look at where the vulnerabilities exist within the open source dependencies used in our petclinic source code, as well as the base image layer vulnerabilities.







Give Snyk App a go yourself and start scanning those OCI container images built using Cloud Native Buildpacks now!!!

More Information
Main Snyk Web Page

Snyk Container
Categories: Fusion Middleware

Murder in the Age of Enlightenment

Greg Pavlik - Fri, 2021-04-09 17:51

I had a few days of downtime to deal with some medical issues and turned to some short story collections to fill the time. My companions for a bit were Ryūnosuke Akutagawa and Anton Chekhov. I was quite delighted with a new translation of Akutagawa from Pushkin Press, Murder in the Age of Enlightenment. What sparse but sharp imagery - taken from Japanese history, European literature, Mahayana Buddhism, Christianity, Chinese writings - it was a bit of a smorgasbord. Akutagawa can be dark: his preoccupation with suicide in his writing no doubt reflected in his own suicide at age 35; I found his piece Madonna in Black on a peculiarly evil Maria-Kannon to be troubling, not least because I have a kind of devotional fascination with Maria-Kannon as our Lady of Mercy. But still Akutagawa is deeply humanistic and wide-ranging. The Karetnyk translation can be digested in an afternoon, no doubt time well spent.

My Chekhov choice was the recent translation of fifty-two stories by the unsurpassable translator pair Richard Pevear and Larissa Volokhonsky. These two are artists in their own right... I can't say enough good things about their portfolio of translations. They are so good I've been forced to re-read a number of novels just to digest their interpretative readings over the years.

But back to Akutagawa. Here I post a translation done under Creative Commons license* of the story The Spider's Thread. I don't know if this is a re-telling of Dostoevsky's "Tale of the Onion" in Karamazov for sure, though the story line is so close that I find it impossible to believe otherwise: Lord Buddha Shakyamuni simply replacing the Guardian Angel. Get the Pushkin Press book to read it in a slightly more refined form, but I found this a wonderful read as well:


ONE


One day, the Buddha was strolling alone along the edge of a lotus pond in Paradise. The blooming lotus flowers in the pond were each pure white like jewels, and the place was filled with the indescribably wondrous fragrance continually emitted from each flower’s golden center. It was just morning in Paradise.

After a time, the Buddha paused at the edge of the pond and from between the lotus leaves that covered it saw a glimpse of the state of things below. Now this celestial pond just happened to lie directly over Hell, and peering through that crystal-clear water was like looking through a magnifying glass at the River of Death and the Mountain of Needles and such.

The Buddha saw there, in the depths of Hell, a single man writhing along with the other sinners. This man was named Kandata, and he had been a notorious thief who had performed murder and arson and other acts of evil. In his past, however, he had performed just one good deed: one day, when walking through the deep forest, he saw a spider crawling along the road. At first he raised his foot to crush it, but suddenly he changed his mind and stopped, saying, “No, small though it may be, a spider, too, has life. It would be a pity to meaninglessly end it,” and so did not kill it.

Looking down upon the captives in Hell the Buddha recalled this kind act that Kandata had performed, and thought to use his good deed as a way to save him from his fate. Looking aside, there on a jade-colored lotus leaf he saw a single spider, spinning out a web of silver thread. The Buddha carefully took the spider’s thread into his hand, and lowered it straight down between the jewel-like white lotuses into the depths of Hell.


TWO


Kandata was floating and sinking along with the other sinners in the Lake of Blood at the bottom of Hell. It was pitch black no matter which way he looked, and the occasional glimpse of light that he would see in the darkness would turn out to be just the glint of the terrible Mountain of Needles. How lonely he must have felt! All about him was the silence of the grave, the only occasional sound being a faint sigh from one of the damned. Those who were so evil as to be sent to this place were tired by its various torments, and left without even the strength to cry out. Even the great thief Kandata could only squirm like a dying frog as he choked in the Lake of Blood.

But one day, raising up his head and glancing at the sky above the lake, in the empty darkness Kandata saw a silver spider’s thread being lowered from the ceiling so far, far away. The thread seemed almost afraid to be seen, emitting a frail, constant light as it came down to just above Kandata’s head. Seeing this, Kandata couldn’t help but clap his hands in joy. If he were to cling to this thread and climb up it, he may be able to climb out of Hell! Perhaps he could even climb all the way to Paradise! Then he would never be chased up the Mountain of Needles, nor drowned in the Lake of Blood again.

Thinking so, he firmly grasped the spider’s thread with both hands and began to climb the thread, higher and higher. Having once been a great thief, he was used to tasks such as this. But the distance between Hell and Paradise is tens of thousands of miles, and so it would seem that no amount of effort would make this an easy journey. After climbing for some time Kandata tired, and couldn’t climb a bit higher. Having no other recourse, he hung there from the thread, resting, and while doing so looked down below.

He saw that he had made a good deal of progress. The Lake of Blood that he had been trapped in was now hidden in the dark below, and he had even climbed higher than the dimly glowing Mountain of Needles. If he could keep up this pace, perhaps he could escape from Hell after all. Kandata grasped the thread with both hands, and laughingly spoke in a voice that he hadn’t used in the many years since he had come here, “I’ve done it! I’ve done it!”

Looking down, however, what did he see but an endless queue of sinners, intently following him up the thread like a line of ants! Seeing this, surprise and fear kept Kandata hanging there for a time with mouth open and eyes blinking like a fool. How could this slender spider’s web, which should break even under just his weight, support the weight of all these other people? If the thread were to snap, all of his effort would be wasted and he would fall back into Hell with the others! That just would not do. But even as he thought these thoughts, hundreds more, thousands more of the damned came crawling up from the Lake of Blood, forming a line and scurrying up the thread. If he didn’t do something fast, surely the thread would snap in the middle and he would fall back down.

Kandata shouted out, “Hey! You sinners! This thread is mine! Who said you could climb up it? Get off! Get off!”

Though the thread had been fine until just then, with these words it snapped with a twang right where Kandata held it. Poor Kandata fell headfirst through the air, spinning like a top, right down through the darkness. The severed end of the silver thread hung there, suspended from heaven, shining with its pale light in that moonless, starless sky.


THREE


The Buddha stood in Paradise at the edge of the lotus pond, silently watching these events. After Kandata sank like a stone to the bottom of the Lake of Blood, he continued his stroll with a sad face. He must have been surprised that even after such severe punishment Kandata’s lack of compassion would lead him right back into Hell.

Yet the lotus blossoms in the lotus ponds of Paradise care nothing about such matters. Their jewel-like white flowers waved about the feet of the Buddha, and each flower’s golden center continuously filled the place with their indescribably wondrous fragrance. It was almost noon in Paradise.


(16 April 1918)

* Translation http://tonygonz.blogspot.com/2006/05/spiders-thread-akutagawa-ryunosuke.html

Silence in 4 Movements

Greg Pavlik - Wed, 2021-03-24 12:05

"What is the relation of [contemplation] to action? Simply this. He who attempts to act and do things for others or for the world without deepening his own self-understanding, freedom, integrity and capacity to love will not have anything to give others. He will communicate to them nothing but the contagion of his own obsessions, his aggressiveness, his ego-centered ambitions, his delusions about ends and means, his doctrinaire prejudices and ideas."
—Thomas Merton


"Those who know do not talk.

Those who talk do not know.

~

Stop talking,

block off your senses,

meditate in silence,

release your worries,

blunt your sharpness,

untie your knots,

soften your glare,

harmonise your inner light

and unite the world into one whole!

This is the primal union or secret embrace."

Tao Te Ching 56

"Make stillness your criterion for testing the value of everything, and choose always what contributes to it."

-Evagrius Ponticus




"Silence and Beauty - Eco" (Minerals and gesso on canvas, 2016) by contemporary Japanese American artist Fujimura Makoto (藤村真, born in Boston, Massachusetts in 1960). Abstract expressionist piece done with nihonga techniques. Picture found online.

I read Fujimura's book Silence and Beauty last year, which was inspired by Endo Shusaku's 20th century novel Silence (itself adapted to film by the great Martin Scorsese). Fujimura reflects on his relationship with Japanese culture in the light of Shusaku's work, the Hiroshima bombing, and his own experience as a Japanese-American: most importantly, how it has manifested in his work as an artist. Shusaku's work itself dwells on "silence" as absence. But I think this painting shows absence-as-presence: something is there, something beautiful, but it's not clear what or even why - in fact that presence changes over time for the viewer, depending on vantage point or even focus.

Elastic Cloud with Observability 7.11 Using Rancher k3d for K8s

Pas Apicella - Thu, 2021-02-25 03:02

I started working locally with Rancher's k3d to spin up K8s clusters, so I thought: why not use Elastic Cloud with Elastic Observability to monitor this local K8s cluster, and even run container workloads that use Elastic APM to monitor the application through tracing?

A successful Kubernetes monitoring solution has a few requirements:
  • Monitors all layers of your technology stack, including:
    • The host systems where Kubernetes is running.
    • Kubernetes core components, nodes, pods, and containers running within the cluster.
    • All of the applications and services running in Kubernetes containers.
  • Automatically detects and monitors services as they appear dynamically.
  • Provides a way to correlate related data so that you can group and explore related metrics, logs, and other observability data.

Some basic steps to get this running are shown below. This was tested with Elastic Stack 7.11 using the Elastic Cloud service.

Pre Steps

1. I have my Elastic Stack running on Elastic Cloud, as per the screenshot below.

2. I have k3d installed on my Mac as follows

$ k3d --version
k3d version v4.0.0
k3s version latest (default)

3. You will need the kubectl CLI as well
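
If you don't already have kubectl, it can be installed in the usual ways; for example, on macOS with Homebrew (an assumption about your setup):

$ brew install kubectl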

Steps

1. First, let's start a K8s cluster as follows. I have created a cluster with 1 server (master) node and 5 worker (agent) nodes, but you can reduce the number of worker nodes if you don't have the memory to support this.

$ k3d cluster create elastic-k3d --servers 1 --agents 5

2. Start your k3d cluster as follows, if it's not already started.

$ k3d cluster start elastic-k3d
INFO[0000] Starting cluster 'elastic-k3d'
INFO[0000] Starting Node 'k3d-elastic-k3d-agent-4'
INFO[0000] Starting Node 'k3d-elastic-k3d-agent-3'
INFO[0000] Starting Node 'k3d-elastic-k3d-agent-2'
INFO[0001] Starting Node 'k3d-elastic-k3d-agent-1'
INFO[0001] Starting Node 'k3d-elastic-k3d-agent-0'
INFO[0001] Starting Node 'k3d-elastic-k3d-server-0'
INFO[0002] Starting Node 'k3d-elastic-k3d-serverlb'

3. Let's list the nodes just to verify what we created.

$ k get nodes
NAME                       STATUS   ROLES                  AGE   VERSION
k3d-elastic-k3d-server-0   Ready    control-plane,master   34h   v1.20.2+k3s1
k3d-elastic-k3d-agent-4    Ready    <none>                 34h   v1.20.2+k3s1
k3d-elastic-k3d-agent-1    Ready    <none>                 34h   v1.20.2+k3s1
k3d-elastic-k3d-agent-3    Ready    <none>                 34h   v1.20.2+k3s1
k3d-elastic-k3d-agent-2    Ready    <none>                 34h   v1.20.2+k3s1
k3d-elastic-k3d-agent-0    Ready    <none>                 34h   v1.20.2+k3s1
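
Note: k in the command above is presumably just a shell alias for kubectl, along the lines of:

$ alias k=kubectl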

Now it's time to "Monitor Kubernetes: Observe the health and performance of your Kubernetes deployments". To do that we can just follow the tutorial below.

https://www.elastic.co/guide/en/observability/7.11/monitor-kubernetes.html

This tutorial will walk you through how to do the following (example download commands for the first two items are shown after this list):

  • Deploy filebeat into your k3d cluster.
  • Deploy metricbeat into your k3d cluster.
  • Deploy an application which is instrumented using the Elastic APM agent.
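
The filebeat and metricbeat manifests referenced below can be fetched roughly as the Beats-on-Kubernetes docs describe (the exact URLs are an assumption based on the 7.11 branch of the elastic/beats repository):

$ curl -L -O https://raw.githubusercontent.com/elastic/beats/7.11/deploy/kubernetes/filebeat-kubernetes.yaml
$ curl -L -O https://raw.githubusercontent.com/elastic/beats/7.11/deploy/kubernetes/metricbeat-kubernetes.yaml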

4. When installing filebeat and metricbeat, make sure you add your Elastic Cloud credentials, which can be obtained from the Elastic Cloud deployments page and would have also been provided as an XLS file to download when you created your deployment.

filebeat-kubernetes.yaml

    processors:
      - add_cloud_metadata:
      - add_host_metadata:

    cloud.id: pas-K8s:{PASSWORD}
    cloud.auth: elastic:{PASSWORD}

metricbeat-kubernetes.yaml (notice how I have added the add_kubernetes_metadata processor)

    processors:
      - add_host_metadata:
      - add_kubernetes_metadata:

    cloud.id: pas-K8s:{PASSWORD}
    cloud.auth: elastic:{PASSWORD}
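
With the credentials filled in, the edited manifests are applied in the usual way:

$ kubectl apply -f filebeat-kubernetes.yaml
$ kubectl apply -f metricbeat-kubernetes.yaml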

5. Once installed, the filebeat and metricbeat Pods should be running as per the output below.

Filebeat

$ kubectl get pods -n kube-system -l k8s-app=filebeat
NAME             READY   STATUS    RESTARTS   AGE
filebeat-m8s6s   1/1     Running   2          32h
filebeat-g86vf   1/1     Running   2          32h
filebeat-bj548   1/1     Running   2          32h
filebeat-5cpcn   1/1     Running   2          32h
filebeat-nwj2h   1/1     Running   2          32h
filebeat-4hs8j   1/1     Running   2          32h

Metricbeat

$ kubectl get pods -n kube-system -l k8s-app=metricbeat
NAME               READY   STATUS    RESTARTS   AGE
metricbeat-pbfvs   1/1     Running   2          32h
metricbeat-v5n8l   1/1     Running   2          32h
metricbeat-cdfz9   1/1     Running   2          32h
metricbeat-z85g8   1/1     Running   2          32h
metricbeat-4fxhh   1/1     Running   2          32h
metricbeat-g25lh   1/1     Running   2          32h

6. At this point it's worth heading to the Kibana Observability page; from there you will see that we have logs and metrics from our local k3d K8s cluster, as shown below.


7. Now click on the Metrics view and you can clearly see each of your k3d K8s nodes; clicking on any node gives you a very convenient view of metrics, processes, metadata and more, all from a single page. This was one of the new Elastic 7.11 features for Observability.



8. Finally, to deploy an application to this K8s cluster and have Elastic APM instrument its activity, you can either use the example in the tutorial link provided above OR just follow this very simple example to get this done:

https://github.com/papicella/elastic-customer-api-rest



More Information
k3d Home Page
k3d 

Elastic Cloud Service
Elastic Cloud 

Categories: Fusion Middleware
